<![CDATA[ Latest from PC Gamer UK in Hardware ]]> https://www.pcgamer.com 2025-06-05T09:45:32Z en <![CDATA[ Apparently there's going to be a Sam Altman movie about his Lazarus-like career stumble, which I'm sure I'll reluctantly watch on a plane at some point ]]> The big tech biopic is a genre showing no signs of slowing down. Between 2010's The Social Network and the surprisingly solid 2023 BlackBerry, I've certainly enjoyed a few fictionalised humblings—though some would argue that the former Fincher flick can't help but print the legend. At any rate, Amazon MGM Studios are currently developing a movie based on Sam Altman's infamous firing and rehiring at OpenAI in 2023.

With a screenplay written by Simon Rich, and Challengers' Luca Guadagnino in talks to direct, filming for 'Artificial' could begin imminently (via The Hollywood Reporter). Though Guadagnino has yet to sign on the dotted line, the production is considering shooting across both San Francisco and Italy later this summer.

Still, that leaves at least one big question unanswered: Namely, who will play Altman himself? While this has also yet to be confirmed, an assembly of actors including Andrew Garfield, Monica Barbaro, and Yura Borisov are discussing various casting opportunities in the meantime.

For those unfamiliar with the big tech farce Artificial hopes to adapt, let me give you the whistlestop tour: In November of 2023, Sam Altman was ousted as the CEO of the company he co-founded, with a statement at the time saying, "The board no longer has confidence in his ability to continue leading OpenAI." Mira Murati replaced Altman, leading the company as an interim CEO until a more permanent replacement could be chosen, though this was short-lived.

Not even a week later, Altman's return was secured following both internal pressure and external pressure from Microsoft, a major investor in the AI firm. The board members that ousted Altman departed the company soon after. Altman himself was not forthcoming about the reasons behind his firing, though an investigation by Reuters cites AI safety concerns as one reason. It will be interesting to see how Artificial dramatises this, to say nothing else.

Though I'm the last person that could be described as a 'fan' of OpenAI, I'm nonetheless intrigued by Artificial—especially if Guadagnino does sign on to direct. Challengers was arguably the best movie I saw in 2024, offering an intense blend of tennis, techno, and, uhm, trysts, that I've been really annoying about in conversations with my fellow film buffs for the last six months.


Best gaming PC: The top pre-built machines.
Best gaming laptop: Great devices for mobile gaming.

]]>
https://www.pcgamer.com/hardware/apparently-theres-going-to-be-a-sam-altman-movie-about-his-lazarus-like-career-stumble-which-im-sure-ill-reluctantly-watch-on-a-plane-at-some-point/ TdPucy4LyuQz4UaGRVP7QY Thu, 05 Jun 2025 09:45:32 +0000
<![CDATA[ CD Projekt Red and Epic claim Witcher 4 development 'ramped up dramatically' Unreal Engine's open world game capabilities ]]> We recently sat down with both CD Projekt Red and Epic to find out a little bit more about that, well, epic Witcher 4 tech demo and what the new Unreal Engine 5.6 brings to game development. The main takeaway is that their collaboration won't just make Witcher 4 an incredible open-world experience. It looks like it will help other developers literally up their game when it comes to open-world titles.

"This is exactly what we wanted to do in Unreal and have been talking about since the middle of Unreal Engine 4. Improving open-world development in Unreal is a very long conversation and you can go back to each and every release and see something in it that is guiding us towards even more excellence in open world development," says Wyeth Johnson, Epic's senior director of product strategy.

He then motions upwards vigorously to illustrate how much better the latest Unreal 5.6 engine is for open world game development. "We ramped up dramatically for this release and I think it shows," Johnson says, referring to the Witcher 4 demo that's been blowing everyone's mind, particularly because it's running on mere Sony PS5 hardware, not even the PS5 Pro.

Johnson also re-emphasized just how important performance is for Unreal Engine 5.6. "Our focus for 5.6 is almost exclusively performance," he says.

"Developers should be able to achieve incredible quality, and they get to define what that quality is, maybe it's the perfect pixel, or maybe it's incredible scale, or animation features or AI, or whatever the case may be," he says.

Epic interview

Epic's Wyeth Johnson (right) says Unreal 5.6's capability for open-world game development is dramatically higher. (Image credit: Future)

"Performance is not just what a player interacts with, it's also what a developer interacts with. And when the water level goes up and the time to achieve quality goes down, then you can be more expressive, you can be more playful, you can try to dream a little bit more than just worrying about all these systems holding you back. I would expect every developer who is trying to do something really original and creative in the Triple-A space, they're all going to be excited about this release, " Johnson says.

In other words, Unreal 5.6's performance optimizations should mean much higher quality visuals and gameplay on existing hardware.

Unreal 5.6 is also about making things easier for developers in the first place. "We need to bring the overall level of the engine down and keep the overhead as low as possible so that developers themselves can implement interesting behaviours and they can trust that those behaviours can go right in and players can experience them. Every developer who uses Unreal will benefit from what we showed today. That's the most fundamental takeaway," Johnson says.

As for CD Projekt Red, it seems like the transition from their own Red Engine to Unreal has been an unambiguous success. "We had a blast with Unreal," says Witcher 4 senior technical animator Julius Girbig.

"It's not like we lost things because of the transition. We're bringing our experience that we already have from Red Engine, all the Triple-A, open world with streaming, all that experience we're now bringing over to experts who have been building engines for years," he says of the collaboration with Epic.

"I'm a pretty surface-level animation artist, so I'm not actually digging into code that much," Girbig goes on to explain, "but I'm still able to run these hundreds of characters, create behaviours for them within the editor with this supportive UI, it just unlocks me as an artist.

"Now with this engine it suddenly unlocks me to create much bigger things and to think bigger. It's really awesome to be able to use things like the Nanite foliage to create these vast forests and whatever comes to your mind."

The end result is certainly impressive. Indeed, it's almost hard to believe that it's running on the Sony PS5, it looks that good. But Johnson confirms that it's all running on completely standard PS5 hardware.

"There are intrinsic things that the Playstation is amazing at, and we kind of show every single one of them in that technical demo," he says. Given the most spec of the PS5 compared to the latest PC hardware, that bodes well. Even AMD's new entry-level RDNA 4 GPU, the Radeon RX 9060 XT, blows the PS5 away for raw specs and especially ray tracing performance.

Unreal 5.6

Pre-baked fluid dynamics will allow realistic water without the frame-rate hit. (Image credit: CD Projekt Red)

Speaking of which, Johnson confirms that the demo was indeed using Unreal's Lumen ray tracing engine in hardware mode. If it runs well on the PS5, it should fly on any relatively recent PC graphics card.

Overall, the upside of CD Projekt Red and Epic working together is that both sides of the collab' benefit, and that means better games for all of us to play. "There's no possible way that we could push in every direction to the apex level that developers like CD Projekt Red are capable of pushing," Johnson says.

"We don't have any heartburn if somebody wants to make a modification to Unreal for their specific use and in fact that's a wonderful success for us, because it means we gave them a nice place to plug in the thing that they needed that we didn't provide."

The only frustration in all this is that Witcher 4 only started development late last year and isn't due out for a few more years at the very earliest, and likewise other games that benefit from all of CD Projekt Red and Epic's latest learnings aren't exactly around the corner.


Best CPU for gaming: Top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game first.

]]>
https://www.pcgamer.com/hardware/cd-projekt-red-and-epic-claim-witcher-4-development-ramped-up-dramatically-unreal-engines-open-world-game-capabilities/ XrnUkZ7h25JR6tgKvRTy4 Wed, 04 Jun 2025 16:37:24 +0000
<![CDATA[ This tech DIYer swapped their graphics card's cooling for a CPU cooler because why not? Also it actually works ]]>

If you've ever looked at that old graphics card sitting at the back of the shelf and thought, man, I wish that thing took up four PCIe slots and could dress up as a CPU, the good news is all you need is a 3D printer and a screwdriver.

As YouTuber TrashBench shows us (via a now-removed Reddit post), with just such tools, you can put a CPU cooler on a GPU instead of its normal cooling setup and actually improve your thermals.

TrashBench unscrews the shroud and cooler and first attempts to affix the Cooler Master heatsink and fan to the GPU with zip ties, but this "looked dumb, ran hot, and nearly rattled itself apart."

The graphics card in question was a GTX 960 (what a card!), and in 3DMark Fire Strike it reached 40 °C with the default cooler and shroud. After being stripped of this default setup and zip tie jerry-rigged with a CPU cooler, it hit 50 °C, so not a great result.

So, the tech YouTuber used a 3D printer to make a "proper bracket", i.e. one that fits the screw holes already there in the PCB, and used it to screw in and mount the CPU cooler on top, with thermal paste betwixt the two, of course. The result was 28 °C, a resounding success.

a CPU cooler on a graphics card in a test bench in front of a computer screen

(Image credit: TrashBench at YouTube)

This solution isn't very practical, of course, if you have, well, just about anything below the GPU. The cooler shoots down to the bottom of the motherboard.

I suppose it might fit in a case, but good luck working with anything on the motherboard below the graphics card.

Still, it's a little surprising how simple it is to pull off something like this these days—3D printers really have done the DIY world a service. And if you don't have one of those but do happen to have some zip ties (and if you don't mind worse thermal performance), you can always try that method.

The zip tie cooler idea is new to me, but my colleague Andy said he and his step-brother used to do the very same thing with graphics cards in times of yore. Why not? It makes me half tempted to dig the ol' GTX 1070 out of the cupboard.


Best AIO cooler for CPUs: Keep your chip chill.
Best air cooler for CPUs: Classic, quiet cooling.
Best PC fans: Quiet and efficient.

]]>
https://www.pcgamer.com/hardware/graphics-cards/this-tech-diyer-swapped-their-graphics-cards-cooling-for-a-cpu-cooler-because-why-not-also-it-actually-works/ hKd7YEBmdgMpQguDqPXQfU Wed, 04 Jun 2025 15:48:33 +0000
<![CDATA[ I built a real SteamOS Steam Machine out of the guts of an old laptop so Gabe doesn't have to go through that whole sad dance again ]]> I'm reminded of the time I met Gabe Newell in Las Vegas. Now, if I never type another word the story would seem far cooler than it is. Sadly it was just another CES trip, way back in the January of 2014; a simpler time where people thought the future might be a positive place. Ah, how naïve we once were.

I'd been invited to a relatively small suite at the top of the Palms hotel and casino, just off the Strip, where Valve and its partners were demonstrating Steam Machines for the first time. Remember those little PCs—determinedly not Windows-based—promising to deliver the living room console experience for PC gamers?

Gabe delivered the presentation from a few feet in front of me, spoke enthusiastically about the future of Linux-powered game boxes, and then let us gathered journos loose to prod a selection of different Steam Machines and play with some prototype Steam Controllers within SteamOS.

And maybe a handful of words passed between Gabe and me during the event. Probably something like "I hope the quality of these prototype controllers carries on to the final units" or "you better hurry up and get SteamOS finished or else Alienware is going to do something crazy like launch its Steam Machine with Windows."

Or maybe it was a somewhat less erudite: "Hello Mr. Newell. Um, I like your games."

It was a long time ago, and the memory… ah, she is a fallible mistress.

Gabe Newell unveiling Steam Machines at CES in 2014

But what I do remember is the utter failure of the whole endeavour as SteamOS itself faltered before it got out the gate and the Steam Machines themselves just became wee Windows boxes, before disappearing altogether. It was all rather sad. After an unveiling that promised an OS which "combines the rock-solid architecture of Linux with a gaming experience built for the big screen", we ended up with delay upon delay, and what was an exciting moment for gamers and developers ended with some very late launches and some unhappy partners stuck with Steam Machine chassis they couldn't shift.

Sure, the controller itself got out there, but subjectively speaking, it sucked.

You could say, however, that Gabe was playing the long game. Every detail of the unfortunate Steam Machine debacle led into the Steam Deck; arguably the true embodiment of what Valve was originally going for and surely the most successful piece of hardware the company has ever produced.

At its heart is a fully realised third-generation version of SteamOS, a distribution built on top of Arch Linux, and such a hit it has been that since the Deck released back in 2022 people have been desperate to get the operating system installed onto their other PCs. Despite promises that getting a standalone version out there has been "very high on our list" we're only just starting to see a sort of version you can sort of jam onto your other devices.

In fairness, you've been able to try that out for a while, grabbing the recovery install and seeing if your other hardware will take it, but it's only now, 12 years after the first promise of SteamOS was made, that there is actual baked-in support for other devices in the most recent 3.7 versions.

Lenovo Legion Go S with SteamOS installed showing an update screen

(Image credit: Future)

Jacob has stuck it on his Legion Go S and, of course, I've made a Steam Machine with genuine SteamOS. No Bazzite-ing here. This one's for you, Gabe.

As Valve has stated in the past, SteamOS is still only really here for other handheld devices, specifically supporting the Legion Go S and ROG Ally, and I've honestly had pretty limited success getting the operating system working with any other handheld in my care. I was hoping the Strix Point-based OneXFly F1 Pro would hook it, but the installation procedure falls over before it even really begins. And I didn't get much further with the OneXPlayer X1 tablet, either.

It's worked out better on the Ayaneo Kun, where I've actually got a mostly functional setup that lets me play games with almost the same level of control over the system that the AyaSpace app affords in Windows. Sadly, the speakers don't work—and no amount of command line playtime has changed that—but, worse, the sleep function is entirely non-existent. Basically making it pretty useless as a gaming handheld.

This is a perennial issue of mine with Linux. I recently stuck PopOS on a Blade 15 laptop for giggles and rapidly stopped laughing when I realised the machine would resolutely never sleep. There's always something with Linux and the myriad configs of PC hardware to fix, which is why tinkerers love it, why it's taking so long for Valve to get a standalone version of SteamOS out there, and why it may still not be the perfect pick for regular PC gamers wanting to jam something other than Windows onto their systems. At least not yet.

But the hardware in the Ayaneo handheld is essentially the same as my Framework 13 laptop with the Ryzen 7840U mainboard inside it, so I figured it ought to at least work as badly. Yet it works a treat. Once I'd unzipped the downloaded recovery file from Steam and loaded it onto a USB stick via Rufus, I was installing SteamOS onto my laptop in no time.

And everything works.

Framework 13 laptop with SteamOS running on the screen

(Image credit: Future)

The sound bursts from the speakers the moment it boots, the sleep function works exactly how it does on the Steam Deck, and with Decky Loader and SimpleDeckyTDP I can control the power of the thing. And that's exactly what I want. It makes for a very effective light-gaming laptop, with a Linux desktop there for any proper notebook work at the flip of a system setting.

But the joy of the Framework setup is that you can whip out the mainboard and just use it in Standalone mode to create your own mini PC. So long as you remember to put it in that mode via the BIOS before you rip it out of the laptop frame, that is. Yeah, I had a few stutters on the way to Steam Machine nirvana.

And so, with the assistance of the Cooler Master-made Framework case I now have a delightful mini PC that can plumb into a big screen living room TV and boot instantly into a gaming OS easily controllable via a joypad. We've finally got GeForce Now natively on SteamOS, too, so I've been playing Indiana Jones at 4K Ultra settings via this little Steam Machine, kicking back in my chair.

Like it or not, Windows has never made for a good experience in the living room, but SteamOS is finally delivering on its 'screw Microsoft' promise.

The best thing is that it's been surprisingly easy to get things working. Okay, it's third time lucky in terms of the products that I've tried to install SteamOS on, but it's been a no-fuss experience for the most part getting it running on the Framework mainboard. The installation went smoothly, it discovered all the necessary drivers, and Decky Loader and the TDP control are but a download and a command line away respectively.

I've been a little resistant to all the recent clamour of SteamOS now being available for everything—it most certainly is not—but if you can find a device that it does work on, it works brilliantly.

Certainly the Radeon 780M iGPU inside my wee slab of a PC isn't going to set any speed records natively, but it's more than capable of delivering good 1080p gaming performance, and when you're running permanently cabled into the mains there's no excuse not to run it at its 40 W maximum.

In all, my new Steam Machine is responsive, accessible, and surprisingly stable for a beta OS. Oh yeah, I'm running the latest beta version of the operating system, partly because I'm a glutton for punishment, but also because it now brings in keyboard shortcuts for the two SteamOS side menus. That's certainly useful in both laptop and desktop modes, and 12 years later I'm on the way to becoming more of a convert to Gabe's thinking.

As long as everything with the final standalone release is so easy… and why wouldn't it be?

How to install SteamOS on other devices

Framework 13 laptop and Cooler Master Framework case, with I/O modules around them

(Image credit: Future)

The process I've gone through to get my li'l Steam Machine up and running is really straightforward, and also something you could easily replicate yourself. It wouldn't necessarily have to be with a Framework board and case, either, as there's a good chance a lot of AMD-based mini PCs will take the install, too.

The first step is to download the Steam Deck recovery file. This is the data that is used to reset the Steam Deck, but the same page is also where Valve details the methods used to get SteamOS onto other devices, though Valve obviously washes its hands of responsibility for smooth operation on anything other than the Deck, Legion Go S, and ROG Ally.

The next thing you'll need to do is unzip the file you just downloaded. You can mount it on a boot drive as is, but I've had better success from unpacking it first.

Now, get yourself a USB drive you don't mind wiping—preferably 8 GB+ as there's a lot of SteamOS—and then format it as FAT32.

And to mount the OS on there, you'll need Rufus. Download the program and boot it up, and then you can point it at your USB drive and the SteamOS repair file. Click Start to get the ball rolling and a handful of minutes later Rufus will have created you a bootable SteamOS installation drive.

Rufus with the SteamOS repair files

(Image credit: Future)

Now, you will need to prep whatever device you're hoping to install to. The first thing I would suggest is that you find the most recent BIOS you can and make sure it's installed on your target device. For some handheld devices that might be the difference between functional sleep modes or not, because Modern Standby or the lack thereof is what's holding back my Ayaneo device right now, and that's only enabled via a firmware update.

You will also need to disable Secure Boot in the BIOS. This is a vital step in ensuring your device even gets close to installing the Valve OS.

And for me, I needed to enable Standalone mode in the Framework BIOS. There is a chassis intrusion sensor that we need to tell the mainboard to ignore if it's getting stuck inside anything else—like a wee case, or a power desk for example—and you can't access it once the board is outside of the Framework 13 laptop chassis.

For this project I actually ran the installation while the Framework board was still configured inside a laptop—I wasn't 100% confident the setup would be fine, and figured it would be a quicker operation with a pre-attached screen, keyboard, and touchpad.

But the process is essentially the same for any other device: stick the USB drive in a spare port on your machine, boot into the BIOS, and select your USB stick (partition 1) as the boot drive. From there on you'll hopefully get a screed of text cascading down the screen before you boot into a live OS to create your install.

SteamOS live recovery disk options

(Image credit: Future)

This is essentially the SteamOS desktop environment. There will be a selection of shortcuts at the top of the screen, and you want to double-click the one marked Wipe Device and Install SteamOS. Obviously.

You'll need to confirm that you want to raze everything on your machine to the ground, and click Proceed. It will then go through the process of installing the operating system for you before rebooting.

If you are lucky, and your system is content with the current state of SteamOS, then you will find yourself booting into a fresh new operating system.

The first thing to do here is to make sure you are running the latest version of the OS. Nip into the Settings menu, select System and check for software updates. Personally, at this point, I would look at the System Update Channel section and select Beta rather than Stable. Chances are you'll benefit more from quicker updates.

From here, the one thing I would make sure of is to have Decky Loader and SimpleDeckyTDP installed so you can control the power draw of your system. This is especially important if you're running something off a battery, but it also helps you control noise to a certain extent.

You'll need to jump into the desktop mode of SteamOS to do this, though. So bring up the Settings menu again, go into Power and Switch to Desktop. It will then boot into a relatively familiar Windows-style desktop environment.

SteamOS in Desktop mode

(Image credit: Future)

You may need to install Firefox so you have a functioning browser, but that will be accessible from the Applications button on the taskbar. Then navigate to the Decky Loader main page and hit download. Next, bring up the file manager to access your downloads, double-click the file, and select Execute to run it.

Getting the power control feature installed is a bit more Linux-y in that you'll need to hit the Konsole. But first you'll likely need a password so you can get things done in there. Hit the System Settings button, scroll down to Users and hit the Change Password button. You won't have one already, so just put in one now, and select not to change the password for other areas.

Now, navigate to the SimpleDeckyTDP github page and go to the install section. From here, boot the Konsole app and copy the install commands from the browser into the command line in Konsole. Hit enter, and it will start installing.

Once that's complete, reboot your machine into Game mode and you'll find Decky Loader and SimpleDeckyTDP at the foot of the secondary control menu. That's either Shift+CTRL+Tab on a keyboard or Xbox button+A on a controller.

Framework AMD mainboard running in a Cooler Master chassis, with SteamOS on it

(Image credit: Future)

From there, you're weapons-free to install and play games at will on your new machine. Have at it. For me, well, once that was done I had to get the mainboard out of the laptop and install it into the little Cooler Master chassis.

Thankfully Framework makes that super easy, and there's a full guide on how to do it on its site. It was essentially just a question of installing the Wi-Fi cable routing, dropping in the PCB and hooking that and the 3.5 mm audio jack up to the board.

Finally, it's worth noting for anyone doing this that the HDMI display output needed to be on one of the top two I/O modules for it to output to our 4K Alienware OLED properly, and the same is true of the USB Type-C module, too.

]]>
https://www.pcgamer.com/hardware/i-built-a-real-steamos-steam-machine-out-of-the-guts-of-an-old-laptop-so-gabe-doesnt-have-to-go-through-that-whole-sad-dance-again/ wNGWgf22kDkvYncY8TDjW6 Wed, 04 Jun 2025 15:45:16 +0000
<![CDATA[ I just found out there's a server tossing championship, and it makes me want to try yeeting my own PC down the road ]]> When you can't sell or even give away your well-worn hardware, it's not unheard of for folks to get a little creative with how they get rid of their e-waste. There are a number of ways to dispose of old tech—though I would strongly recommend against shooting your old phone with an archery arrow, and I would also ask you not to ask me how I came into possession of this knowledge. Thankfully, I've just learnt about a hardware harming pastime I can happily shout about.

CloudFest is a cloud infrastructure conference that saw folks from all over the globe descend on Europa-Park in Rust, Germany earlier this year, and once again brought a little-known underground sport to a wider audience: server tossing.

Conference attendees of all ability levels and genders were welcome to apply to toss servers—the only necessary qualification requested was the "desire to throw a server really %&#*ing far."

The World Server Throwing Championship has been enjoyed by a spectating crowd at CloudFest for a couple of years now—and applications are already open for the US-based server tossing event coming up in November. The sport itself has a long and largely made-up history, but its most likely origin story can be traced back more than a decade to some disgruntled sys admins in Holland.

The first edition of the tournament as we know it today took place in 2011, giving rise to a compound word that makes me wish I spoke a little more Dutch: 'Serverwerpen'.

Archival footage of that 2011 competition looks almost quaint now, but compared to the sizzle reel for 2025's championship, there is one thing that stands out to me: namely, how close the crowd is on all sides in the footage from the more recent events. I'm sure spinning, shot put-style windups are disallowed to minimise the risk of a server toss that goes off-piste, but the lack of eye protection gear does still make me wince at every throw.

Thankfully, it appears most competitors and attendees come out of the event in far better shape than any of the servers. Competitors have two attempts to lob a server as far as they can, with the qualifying throws from 2025's event not just seeing some impressive air but record-breaking distances too.

According to this recap of the 2025 event, Linda Splitt placed first in the women's division by hurling a server a distance of 11.10 metres—an honour she also lists on her LinkedIn profile. In the men's division, however, there was a major upset as Thom van Hal placed first with a throw of 13.18 metres, wresting the previous record from Bartosz “The Beast” Wojciak, who won last year.

Now I don't know about you, but I'm definitely not an IT professional immersed in the world of cloud infrastructure—nor am I especially gifted in the strength department. All the same, I'm still adding 'lob a server' to my career bucket list.


Best gaming PC: The top pre-built machines.
Best gaming laptop: Great devices for mobile gaming.

]]>
https://www.pcgamer.com/hardware/i-just-found-out-theres-a-server-tossing-championship-and-it-makes-me-want-to-try-yeeting-my-own-pc-down-the-road/ Dob4A9whrCgjEkJRjf7K7b Wed, 04 Jun 2025 15:15:20 +0000
<![CDATA[ Google search's AI overviews are awful, but here's a browser extension that gets rid of them ]]> Among the countless examples of the ever-burgeoning ens***ification of the internet, Google's AI-powered search overviews rank pretty highly. Verily, I pine for the days of reliable, organic search results devoid of AI slop.

But don't despair. Well, not entirely. For the editor-in-chief of our sister website, Tom's Hardware, has come up with a browser extension that gets rid of AI overviews from Google search results. Give it up for Avram Piltch and his Bye Bye, Google AI extension.

To quote the man himself, Avram says Google, "decided to push AI overviews and AI mode onto search users, regardless of the damage it causes to the user experience or the harm it may inflict on publishers and the entire open web."

He also points out that Google is rolling out AI Overviews to ever more territories and countries, and fears that Google may eventually want to replace all organic search results with AI Overviews. His solution is the aforementioned Bye Bye, Google AI, which works in Chrome or Edge or any desktop browser that supports Chrome extensions. He's currently working on Firefox and Safari versions.

"The extension allows you to hide the AI Overview section from all of your queries and goes a step further, allowing you to hide other areas of the Google SERP that you may not want, such as the videos section, text ads, or 'People Also Ask,'" Avram says.

The latest 1.5 version now supports 19 languages: English, French, German, Spanish, Korean, Japanese, Mandarin (Trad + Simplified), Arabic, Hebrew, Urdu, Hindi, Thai, Greek, Italian, Polish, Russian, Dutch, Danish and Portuguese. You can also now hide the AI Mode tab, not just AI Overviews.

Avram also explains the other ways to kill the AI Overview, such as adding "-noai" to your search string. But if you want to permanently kill AI Overviews—or at least for as long as this extension works and Google is serving up any organic results at all—then Bye Bye, Google AI feels like a no-brainer of an extension.
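
For illustration, here's what tacking that modifier onto a query looks like if you were assembling the search URL yourself. This is a trivial, hypothetical Python helper, not anything the extension itself does:

```python
from urllib.parse import urlencode

def google_search_url(query: str, hide_ai: bool = True) -> str:
    """Build a Google search URL, optionally appending the "-noai" modifier mentioned above."""
    if hide_ai:
        query = f"{query} -noai"
    return "https://www.google.com/search?" + urlencode({"q": query})

print(google_search_url("best budget graphics card"))
# https://www.google.com/search?q=best+budget+graphics+card+-noai
```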

Of course, you could just use another search engine, like DuckDuckGo. Moreover, the seemingly falling quality of Google's organic results isn't fixed by this extension. But if you just want to remove a little AI slop from your daily interneting, then this could be the tool for you.


Best CPU for gaming: Top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game first.

]]>
https://www.pcgamer.com/hardware/google-searchs-ai-overviews-are-awful-but-heres-a-browser-extension-that-gets-rid-of-them/ S6TtdVm3Vn4smEYjh7GbhW Wed, 04 Jun 2025 14:53:31 +0000
<![CDATA[ I've been gaming with Razer's new wireless charging mouse pad and I'm glad there's finally serious competition for Logitech ]]> Mouse pads that charge your gaming mouse never quite took off in the way I expected them to. They made a grand entrance, then they lingered for a little while, and then seemed to fade away. Now, hot on the heels of Logitech's recent Mk.2 release, Razer has just launched its new HyperFlux V2 Wireless Charging System.

The charging pad, Razer explains, delivers "continuous wireless charging directly through the mouse mat, eliminating the need for charging cables or docks entirely." This, of course, means you won't have to do anything to charge your mouse at all, provided you keep it on the HyperFlux pad.

Not just any mouse, though—it has to be one that can fit Razer's charging puck into the bottom. It's the same charging puck slot that some Razer mice can use with the Dock Pro, so you get the same mouse compatibility as with that dock: Basilisk V3 Pro 35K, Basilisk V3 Pro, Cobra Pro, and Naga V2 Pro.

I've been using the HyperFlux V2 with a Basilisk V3 Pro 35K for the past few days and have been really enjoying my time with it. I've got the hard pad version (there's a cloth one, too), and the first thing that jumps out to me is actually nothing to do with the wireless charging.

It's just how much I've missed using a hard pad because boy is this thing smooth. I sent a quick video to my colleagues of me gently nudging my mouse across it and the general consensus was that it's a little like a puck on an air hockey table. If that slipperiness isn't your jam, of course you can go for the cloth version.

Top-down view of Razer HyperFlux V2 Charging Mouse System mouse pad with Basilisk V3 Pro 35K gaming mouse on top

(Image credit: Future)

Razer HyperFlux V2 vs Logitech G PowerPlay 2

I also spent a lot of time with the new Logitech G PowerPlay 2 charging mouse pad and scored that one a low 40% primarily because it's expensive, has just a flimsy cloth pad on top of the charging station, and most importantly doesn't include a wireless receiver (you have to plug in a separate dongle to connect your mouse to your PC).

I'm happy to say that the Razer HyperFlux V2 does have a wireless receiver built in, so you won't need to plug in a separate dongle for the mouse as well as the pad; you just plug in the pad, which the mouse connects to. Razer claims "seamless auto-pairing" for the HyperFlux V2, and that bore out in my testing. I just slotted the puck into the mouse, slapped it on top of the pad, and it connected and worked.

This Razer pad also looks and feels premium in a way the new Logitech one doesn't. It's solid, has a nice bezel to the edges, is practically immovable when placed, and its receiver station zone at the top (whatever you want to call it) is in that iPhone camera island cut-out kind of style which looks rather nice, as does its LED which hints towards low/medium/high battery life.

Just like the PowerPlay 2, though, it might not be big enough for low sensitivity players. And unlike the PowerPlay 2, the mouse pad is attached, which presumably means no replacements. Although I don't see anywhere to actually get a replacement PowerPlay 2 cloth pad, either.

Razer HyperFlux V2 Wireless Charging System mouse pad with Basilisk V3 Pro 35K gaming mouse on top

(Image credit: Future)

I'm excited that we have some actual competition in this charging mouse pad space. What doesn't make me quite so excited is the price tag. We're looking at $120 for this thing, which is more than the PowerPlay 2. One of my main criticisms of the PowerPlay 2 was its price, and although that was in part in comparison to its predecessor, the fact is that even in a vacuum $120 is still a lot for a mouse pad, charging or no charging.

Perfect peripherals

(Image credit: Colorwave)

Best gaming mouse: the top rodents for gaming
Best gaming keyboard: your PC's best friend...
Best gaming headset: don't ignore in-game audio

I've yet to figure out exactly what I think of that irksome price tag, but I'll be mulling things over as I formulate my full review of the HyperFlux V2.

Razer has done a charging mouse pad before, but that was in the form of a Mamba HyperFlux combo, with both mouse and mouse pad. That tended to go for about $200-$250, and given the Mamba mouse alone went for close to $100, the price for this charging station doesn't seem to have shifted much.

Still, as I said, you are getting a very premium mouse pad here in terms of both looks and feel. And you're getting a built-in wireless receiver so you only have to plug in the mouse pad and not a dongle for the mouse, too. From a subjective perspective, I know that I get excited each morning when I remember I'll soon be using this Razer HyperFlux, and I didn't feel anything like that towards the PowerPlay 2.

For that experience, though, you're spending $20 on top of what you'd spend on the PowerPlay 2 that I reckon costs too much. And you're limited to just a few Razer mice, and not its best ones, at that. It's certainly new competition, but not the ideal competition I'd hoped for. Mixed feelings, here.

]]>
https://www.pcgamer.com/hardware/gaming-mice/ive-been-gaming-with-razers-new-wireless-charging-mouse-pad-and-im-glad-theres-finally-serious-competition-for-logitech/ jdCyCbTF7dvKvfyjkzgxFb Wed, 04 Jun 2025 14:29:02 +0000
<![CDATA[ DeepMind boss 'would pay thousands of dollars per month' to get rid of his email, so Google is working on a next-gen Gmail AI that will answer them in 'your style—and maybe make some of the easier decisions' for you ]]> My Gmail inbox currently has 84,685 unread emails. Were there an AI tool that could clear my backlog automatically, let's just say you could sign me up and take my money. It's the only way I'm going to achieve inbox hygiene (there's always the trashcan button - Ed.). Rejoice, then, because the head of Google DeepMind says they're working on a next-gen Gmail that can answer emails for you.

Yep, Demis Hassabis has been speaking at the SXSW London festival (via the Guardian) about all things AI. And apart from predicting the arrival of AGI or artificial general intelligence within five to 10 years, Hassabis revealed Google's plans for clearing the world's email backlog.

"I would love to get rid of my email. I would pay thousands of dollars per month to get rid of that," Hassabis said. More specifically, he said Google was planning, "something that would just understand what are the bread-and-butter emails, and answer in your style—and maybe make some of the easier decisions."

Such a technology raises all kinds of immediate questions. Such as, why not pay a human thousands of dollars per month to manage that for you? Like an executive assistant, maybe? And exactly what kind of email will it answer? Will it actually save you much time if, in practice, you still need to know that it has answered for you and exactly what it said?

I mean, if your Gmail bot replies, "sure, I'm on it" or "thanks, understood," well, you'd need to be actually on it or have understood it, right? Likewise, do you trust any AI with your personal communications?

Slightly more dystopianly (if that's a word, and it ought to be, I sense we're going to need it, though it might need another 'n' or perhaps an extra 'l'), it also raises the prospect of bots spiralling off into endless back-and-forth comms.

After all, if you are a Gmail user who emails another Gmail user, you'd presumably have situations where you fire off an email, their Gmail bot replies, following which your Gmail bot decides to weigh in and they're off to the races, chatting away.

Somewhat ironically, Hassabis also said AI could be used to protect you from attention-grabbing AI algorithms. The idea is that an AI assistant might give you, "more time and maybe protect your attention from other algorithms trying to gain your attention. I think we can actually use AI in service of the individual."

So, that's Google's bots protecting you from the attention-grabbing efforts of other companies' bots. What a time to be alive that will be. Of course, we could just cut out the toxic attention-grabbing nonsense in the first place.

Sorry, silly idea. Without the toxic algorithms, Google couldn't sell you a bot to protect you from its competitors, while those competitors presumably sell you another bot to protect you from Google's algorithms. This, people, is the future. And it's beautiful. Kinda.


Best CPU for gaming: Top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game first.

]]>
https://www.pcgamer.com/software/ai/deepmind-boss-would-pay-thousands-of-dollars-per-month-to-get-rid-of-his-email-so-google-is-working-on-a-next-gen-gmail-ai-that-will-answer-them-in-your-style-and-maybe-make-some-of-the-easier-decisions-for-you/ GWitYcyRdKM3AdXvgcKwF9 Wed, 04 Jun 2025 13:30:18 +0000
<![CDATA[ AMD Radeon RX 9060 XT 16 GB review (XFX Swift) ]]> The AMD Radeon RX 9060 XT is, I think it's fair to say, one of the most anticipated graphics cards of this generation. Gamers on a tight budget have had a tough time of it in recent years, and AMD's new entry-level card feels aggressively designed (and priced, no less) to hit Nvidia where it hurts.

At $299 for the 8 GB version and $349 for the 16 GB variant, the latter I have in front of me here, the little AMD card is not just looking to knock the $299 RTX 5060 off its precarious perch, but even aims to take a swing at the $429 RTX 5060 Ti. Everyone loves an underdog story, but in a GPU market plagued by inflated prices and less-than-impressive generational performance gains, the RX 9060 XT has its work cut out from the start.

So, what you're likely wondering is, does it give Nvidia's budget offerings a comprehensive floor-wiping? Not quite. But what AMD has come up with here is a cool, calm, and collected 1080p and 1440p performer for a very reasonable sum, and right now that feels like a cool breeze on a hot summer's day.

RX 9060 XT 16 GB verdict

The XFX Swift AMD Radeon RX 9060 XT on a set of bookshelves, with various sci-fi novels behind it

(Image credit: Future)
Buy if…

✅ You want bang for your buck: The RX 9060 XT might not be the fastest card on the market, but nothing touches it for its MSRP. Assuming that price translates to reality, of course, so cross your fingers.

✅ You want a budget upgrade for modern 1080p or 1440p gaming: Smooth 4K gaming is beyond even the fastest budget cards, but at lower resolutions the RX 9060 XT delivers great performance for the cash, particularly compared to previous generations.

Don't buy if...

❌ Money isn't much of an issue: Should you be feeling flush and don't mind spending a fair bit more, the RTX 5060 Ti will deliver faster performance overall—and much more overclocking potential.

❌ You want productivity performance: This is a gaming card, through and through, so if you're looking for productivity chops then Nvidia is the way to go.

I've had the XFX Swift version of the AMD Radeon RX 9060 XT sitting in one of our benchmarking rigs for the past week, and it's rather impressive. While it's not quite been able to beat out our RTX 5060 Ti sample overall (AMD's claims that it's 6% faster than the RTX 5060 Ti at 1440p haven't proved out in my testing), it often comes perilously close at both 1080p and 1440p resolutions for a full $80 less—and that's just the sort of thing that might cause Nvidia to drop the Ti's price down in response.

Competition is a good thing, y'see, and the RX 9060 XT provides just that.

And as for the $299 RTX 5060? Fuggedaboutit. While the Nvidia card is $50 less, the extra performance you receive from the 16 GB RX 9060 XT is more than worth the money, in my opinion.

The RX 9060 XT might not quite have the goods to outright beat the RTX 5060 Ti in many of our benchmarks, but it's so close that it makes the non-Ti card look poorly equipped by comparison. Not to mention, the 8 GB RX 9060 XT is the same cash as the RTX 5060 at MSRP.

That being said, I'll be very interested to see if the RX 9060 XT's MSRP proves out in practice. This generation of GPUs has been marred by low availability, ludicrous retailer mark ups, and a host of factors that have made it hard to recommend any card—given what it'll likely end up costing you when you plug your details in at the checkout.

Ultimately, though, I have to review the GPU in front of me, and I can say that it's a good 'un. It's also been remarkably stable, only finally sullying its 100% reliability record once I pushed those GDDR6 memory chips past their stated speeds. My particular XFX sample isn't much of an overclocker compared to the Palit RTX 5060 Ti, for sure, but I'd wager that even in 2025, most people are more concerned with the apples to apples performance you get from either card fresh from the box.

And in the RX 9060 XT's case? I'd say it's got enough grunt to make the RTX 5060 Ti worry. Time will tell if that pricing holds out, but should you be able to find one for MSRP, it's the new budget GPU I'd plump for if I was looking to save a penny or two. Money makes all the difference at this end of the market, and an $80 saving goes a long way towards some shiny new games, a slightly better CPU, or even just a good night out.

Great entry-level GPUs have been far too expensive for far too long, if you ask me. The RX 9060 XT, though? It might just be the turning of the tide. Fingers crossed, ey?

RX 9060 XT 16 GB specs

The XFX AMD Radeon RX 9060 XT on a set of bookshelves

(Image credit: Future)

When it comes to the specs, the little RX 9060 XT looks a lot like an RX 9070 XT, only halved. It's a Navi 44 variant built on TSMC's N4P process with 32 RDNA 4 Compute Units, 32 Ray Tracing Accelerators and 64 AI Accelerators, all of which matches with that basic equation—although with a stated 3.1 GHz boost frequency as standard, it certainly comes with a hefty dose of clock speed straight out of the box.

Those improved RDNA 4 CUs are key to AMD's recent catch up to Nvidia-like performance, in tandem with third generation RT Accelerators that mean the red team is no longer on the back foot when it comes to the increasingly-important ray tracing performance figures.

We're starting to see games like Doom: The Dark Ages require ray tracing capable (and ideally, performant) graphics hardware these days, and it looks like a trend that's likely to accelerate in years to come.

Spec | RX 9060 XT 16 GB | RX 9070 XT | RTX 5060 Ti 16 GB
Architecture | RDNA 4 | RDNA 4 | GB206
Transistor count | 29.7 billion | 53.9 billion | 21.9 billion
Die size | 199 mm² | 357 mm² | 181 mm²
Compute units/SMs | 32 | 64 | 36
Ray accelerators | 32 | 64 | 36
AI accelerators | 64 | 128 | 144
Shader cores | 2048 | 4096 | 4608
Boost clock | 3130 MHz | 2970 MHz | 2512 MHz
ROPs | 64 | 128 | 48
VRAM | 16 GB GDDR6 | 16 GB GDDR6 | 16 GB GDDR7
Memory speed | 20 Gbps | 20 Gbps | 28 Gbps
Memory bus | 128-bit | 256-bit | 128-bit
PCIe interface | PCIe 5.0 x16 | PCIe 5.0 x16 | PCIe 5.0 x16
TGP | 160 W | 304 W | 180 W
MSRP | $349 | $549 | $429

Memory-wise, both the 8 GB and 16 GB variants make use of GDDR6 connected to a 128-bit bus, with a total effective memory bandwidth of 320 GB/s. That's around 28% less memory bandwidth than you'll get from the RTX 5060 Ti with its speedy GDDR7 chips on board, albeit with the same bus size.
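
To show where that 320 GB/s figure and the roughly 28% gap come from, here's a minimal sketch of the arithmetic in Python, using the data rates and bus widths from the spec table above (the helper function is purely illustrative):

```python
# Effective memory bandwidth = per-pin data rate (Gbps) x bus width (bits) / 8 bits per byte.
# Figures are taken from the spec table above; this helper is purely illustrative.

def memory_bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Return effective memory bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

rx_9060_xt = memory_bandwidth_gb_s(20, 128)   # 320.0 GB/s (20 Gbps GDDR6 on a 128-bit bus)
rtx_5060_ti = memory_bandwidth_gb_s(28, 128)  # 448.0 GB/s (28 Gbps GDDR7 on a 128-bit bus)

deficit = 1 - rx_9060_xt / rtx_5060_ti        # ~0.286, in line with the 'around 28%' figure above
print(f"RX 9060 XT: {rx_9060_xt:.0f} GB/s | RTX 5060 Ti: {rtx_5060_ti:.0f} GB/s | deficit: {deficit:.1%}")
```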

For the 16 GB cards that's probably going to be just about fine, but it might spell trouble for the 8 GB variant of the RX 9060 XT. The Nvidia GPUs with a paucity of VRAM may have a little more to give than the competing 8 GB AMD chip, but as we've yet to manage to get hold of the lower spec version that's all speculation for now.

Mention should also be made of those second generation AI accelerators, which allow the RX 9060 XT to take advantage of the latest, machine learning-enhanced iteration of AMD's upscaling tech, FSR 4. DLSS has long ranged ahead of FSR for sheer image quality thanks to its reliance on local, AI-capable hardware, and the RX 9000-series now has an equivalent of its own.

Multi Frame Generation has oft been touted as a reason to pick up an Nvidia card over the competition, but we've found it doesn't scale so well further down the stack due to increased latency. You can artificially boost the frame rate to gain some impressive figures, but the lower-end RTX 50-series cards show the limits of the tech when it comes to real world gaming experience.

AMD doesn't yet have an exact MFG equivalent, instead primarily relying on 2x Frame Gen as part of FSR 4—although major AI enhancements are said to be coming to AMD's tech in the Redstone update later this year. Regardless, at least when it comes to the bottom end of both companies' current lineups, the AI frame rate-enhancing doohickeys look fairly evenly matched on paper.

All that being said, comparing AMD's efforts to Nvidia's with a specs sheet showdown doesn't reveal the performance differences between the two, nor what it's like to use one for real world gaming. The architectures are distinctly different, so it's in the benchmarks where we'll find whether the budget AMD card has the potential to give the significantly more expensive RTX 5060 Ti some serious trouble.

RX 9060 XT 16 GB performance and benchmarks

The XFX Swift AMD RX 9060 XT 16 GB in a gaming PC, lit up in blue, yellow and pink

(Image credit: Future)

For now, we've only received a 16 GB sample for review, so I can't tell you how the 8 GB variant performs. I have, however, primarily pitted the XFX card against the Palit RTX 5060 Ti Infinity 3 16 GB, with our MSI RTX 5060 8 GB sample thrown in for reference, alongside the RX 7700 XT from the previous generation. I've also dropped in the numbers from the XFX Swift Radeon RX 9070 OC, to give you an idea of what $200 extra (technically) gets you in the world of AMD GPUs right now.

To war, then. AMD's chief architect of gaming solutions, Frank Azor, has been keen to point out that the "majority of gamers are still playing at 1080p", and the RX 9060 XT is aiming for great 1080p and good 1440p performance. And, while the 16 GB variant I have on hand here seems designed to allay concerns of a lack of VRAM, it would be unrealistic to expect smooth 4K frame rates from such a budget offering, and that is certainly reflected in the benchmarks here.

At 1080p, however, the little XFX card puts on an impressive turn of speed. A mere two frames difference between the RX 9060 XT and the RTX 5060 Ti in both the Black Myth Wukong and Cyberpunk 2077 average results is pleasing to see, especially when you factor in the price difference between the two, and the proclivity for ray tracing in CDPR's game. In F1 24 the AMD card pulls five frames ahead on average, although the 1% minimum is a full 10 frames behind.

Total War: Warhammer 3 delivers the best overall result for the XT when taking into account both the average and minimum frames, which is something of a surprise given it's traditionally a more CPU-focussed benchmark, especially at 1080p. I had a play around outside of the benchmarking tool to see what it was like in actual gameplay, and the little AMD provided a reliably high minimum result when paired with the mighty Ryzen 7 9800X3D.

Overall, though, the AMD card is slightly behind on average. Claims that the RX 9060 XT is 6% faster than the RTX 5060 Ti haven't quite proven out in my particular testing, but it's close enough across most benchmarks to show the performance is very much comparable.

1440p is a similar story: Close, but not quite the full cigar when it comes to beating the RTX 5060 Ti overall. Still, F1 24 seems to be the AMD card's jam, as it once again ranges ahead of the Nvidia GPU by a reasonable margin. And then there's 4K where… yep, you guessed it, the AMD card is ever so slightly behind on average once more.

Not that smooth 4K performance was ever on the table for either GPU—but should you be a fan of crunchy gaming, the RTX 5060 Ti delivers slightly more frames. A close run thing, though, no doubt.

It's the real world performance chart where the value of the AMD card becomes apparent, though. With upscaling enabled at 1440p (and frame gen, where applicable) the budget RX 9060 XT delivers figures that would have been decidedly mid-range for the previous generation.

Take note of the Cyberpunk 2077 result, for example. Six frames ahead of the significantly-more-expensive RTX 5060 Ti is nothing to be sniffed at, and is comparable to the figures I recorded with the RX 7800 XT when I played through the game last year with similar settings, minus some of the ray tracing goodies.

That particular card is still what I'd consider a great 1440p GPU in 2025, so the fact that the new budget AMD offering can match it is mighty impressive. And as for the ray tracing performance? It's much better than its older, bigger brother. As gen-on-gen comparisons go, that's the sort of major improvement I can get behind.

And so, up and down we go. The RX 9060 XT is eight frames slower than the RTX 5060 Ti in Black Myth, a full 20 fps quicker in The Talos Principle 2, and eight frames slower on average in Homeworld 3. Pretty much the definition of trading blows, this particular chart, but still a good result for the AMD card when price is taken into consideration.

The real kicker, however, comes when you compare the $349 RX 9060 XT 16 GB to the $299 RTX 5060 8 GB. The AMD tiddler is roughly 14% faster on average at both 1080p and 1440p Native, and a whopping 23% faster in the real world upscaling and frame generation benchmarks.

I'll admit, that last percentage is significantly skewed by a staggering 190 fps 1440p result for the RX 9060 XT in F1 24 with the upscaling and frame generating goodies turned on. We saw similarly astonishing numbers in that particular game in our testing of the RX 9070 and RX 9070 XT. Whatever FSR is providing, this particular F1 racing game seems to love it. DRS, perhaps?

Anyway, while it might seem unfair to pit the 8 GB RTX 5060 against a $50 more expensive 16 GB competitor (although that VRAM difference is likely to only come into play for 4K gaming, which neither card is particularly good at), it's still pretty impressive what you get in terms of extra performance for the cash.
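
If you want to put those price and performance gaps side by side, here's a rough, purely illustrative Python sketch using the MSRPs and the average uplift figures above. It ignores the doubled VRAM, and street pricing will obviously shift the numbers:

```python
# Back-of-the-envelope value comparison: $349 RX 9060 XT 16 GB vs $299 RTX 5060 8 GB.
# The ~14% native and ~23% upscaled uplifts are the averages quoted above; VRAM is ignored here.

rx_price, rtx_price = 349, 299
price_premium = rx_price / rtx_price - 1  # ~0.167, i.e. roughly 17% more money

native_uplift = 0.14    # average advantage at 1080p and 1440p native
upscaled_uplift = 0.23  # average advantage with upscaling and frame generation enabled

# Relative frames-per-dollar versus the RTX 5060 (1.0 = parity).
native_value = (1 + native_uplift) / (1 + price_premium)      # ~0.98
upscaled_value = (1 + upscaled_uplift) / (1 + price_premium)  # ~1.05

print(f"Price premium: {price_premium:.0%}")
print(f"Native frames-per-dollar vs RTX 5060: {native_value:.2f}x")
print(f"Upscaled frames-per-dollar vs RTX 5060: {upscaled_value:.2f}x")
```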

Where the RX 9060 XT truly lags behind, however, is in productivity performance. Not that I think that particularly matters in a budget gaming GPU, but should AI image generation be your thing, you'd be far better off with the Nvidia card.

The Blender result is also fairly disappointing, although it's far from the first AMD card we've seen drop significantly behind the competition in this particular benchmark. Rendering games? Great. Rendering for work? You'll be wanting something else.

Overclocking

While Nvidia's RTX 50-series GPUs are reliable overclockers, I'll admit to clenching my teeth when I push the AMD card above its usual speeds. With a 3.13 GHz boost clock speed as standard, it doesn't feel like there's a whole lot of headroom to play with—and the card as a whole feels like it's pushed fairly close to its limits straight out of the box.

However, I was able to achieve a 300 MHz overclock on both the chip and the memory with little effort using AMD Adrenalin's built-in tuning software, which works fairly well. Going much past this, however, results in some jarring hard locks, particularly when trying to eke a little more speed out of the memory.

The XFX Swift AMD Radeon RX 9060 XT in a PC Gamer benchmarking PC, lit up in RGB

(Image credit: Future)

What's also held me back from pushing the AMD card into the stratosphere is the coil whine. The XFX card runs virtually silently at stock speeds, but boost the frequencies and it gets very chatty, very quickly.

Call me a nervous nellie if you must, but when an overclocked component starts squealing under serious duress, I find it difficult to ignore. This is most evident in The Talos Principle 2, where the RX 9060 XT makes its displeasure audibly known at every opportunity when pushed beyond its specs sheet. Still, that could be a PSU-dependent thing, and is more of an observation of this particular setup than an outright critique.

My testing shows a decent two to three fps improvement from this relatively mild OC in most of our games, with the odd outlier result. That's enough for me to say that if that slight fps discrepancy between the stock RX 9060 XT and the RTX 5060 Ti really does bug you, a little light overclocking can help to close the gap.

However, I would point out that the overclocked Nvidia GPU is capable of going a fair bit further—although your mileage will likely vary from card to card as to what stable clocks and performance you can eventually achieve.

Our Dave found that the RX 9070 and RX 9070 XT both benefit from a spot of undervolting, but it's no bueno on this particular card. Even the mildest -60 mV undervolt causes insta-crashes in most tests, even with a 10% tickle of extra wattage to play with. I'm no expert overclocker, but I'd say the RX 9060 XT has had most of the juice tweaked out of it by AMD to begin with.

Which makes sense. This card is a little late to the party, and I would wager that's because AMD was keen to see what the RTX 5060 Ti was capable of before it committed to final tuning. It feels like a card that's been gussied up to near its maximum in order to give Nvidia a headache right out of the box, so those looking for an overclocking wonder would be better off elsewhere.

PC Gamer test platform
Supplied by Cyberpower | MSI

CPU: AMD Ryzen 7 9800X3D | Motherboard: MSI MPG X870E Edge Ti WiFi | RAM: Kingston Fury Beast RGB 32 GB (2 x 16 GB) @ 6,000 MT/s | Cooler: MAG CoreLiquid i360 White | SSD: Spatium M480 Pro 2 TB | PSU: MPG A1000GS PCIe 5 | Case: MAG Pano 100R White

RX 9060 XT 16 GB analysis

The XFX Swift AMD Radeon RX 9060 XT 16 GB on a bookshelf

(Image credit: Future)

I've had my ear to the ground over the past few months, gauging the reaction of gamers to AMD's new GPU lineup, and I know expectations have been high. After all, the RX 9070 XT manages to give the RTX 5070 Ti a proper run for its money, and many have been hoping that the RX 9060 XT would do the same for the RTX 5060 Ti, too.

And although the AMD card is slightly behind on average in many of our gaming benchmarks compared to the Ti, I still think it achieves its goals—excellent 1080p performance, good 1440p figures, low power draw and cool temperatures, all for a significantly cheaper MSRP than its direct competition.

What's absolutely key here is pricing and availability, and that's a hard thing to judge at this point

Which brings me back to money once more. What's absolutely key here is pricing and availability, and that's a hard thing to judge at this point. While I want to believe that, as AMD claims, the RX 9060 XT will be widely available for its stated price come launch day, we've all been disappointed before.

That being said, a look at our best graphics card deals page reveals multiple 8 GB RTX 5060 and RTX 5060 Ti cards at MSRP or less, and a (relatively) small mark up on the 16 GB Ti variant. The 16 GB RX 9060 XT makes a whole lot of sense if it can maintain its $80 cheaper MSRP than the 16 GB Ti in particular, but should it prove popular (and I'm willing to bet it will) and retailers start to even out the price difference between the two, you're looking at a whole different recommendation.

I've been told by an AMD representative that the XFX Swift 16 GB model I've reviewed here has a recommended price of £315 in the UK. That's an encouraging thought, but again, what it ends up listing for on retailer websites remains up for debate at the time of writing.

As things stand, though, the AMD card seems destined to become the new budget darling of this generation. As someone who regularly recommends PC builds for this very website, I know how tough it can be to spec out a budget gaming rig in 2025, and how every penny counts when it comes to maximising the bang for your respective buck.

And while I wouldn't call the RX 9060 XT an exciting card, what it is is something more tangible. It's a workhorse, a reliable, cheap, chuck-in GPU willing to do some serious work in the graphics mines for much less than its main competition.

In short, it's exactly what we've been waiting for at the lower end of the market, and for that, it's deserving of some serious praise. It might not be the quickest card in its segment, but it gets darn close for significantly less.

]]>
https://www.pcgamer.com/hardware/graphics-cards/amd-radeon-rx-9060-xt-16-gb-review-xfx-swift/ A8CZQyucNgtcXyNSm36xHk Wed, 04 Jun 2025 13:01:00 +0000
<![CDATA[ Spot the connector: Gigabyte's new graphics card design lets you hide your power cable even if you don't have a back-connect motherboard ]]> One of my favourite trends in gaming hardware of late is new ways of tucking away all those cables for a sleeker build. Now it looks like Gigabyte's joining the party with its own method for maintaining a minimalist look, as its latest graphics card has a hidden power connector.

The Gigabyte Aorus RTX 5090 Stealth Ice is a graphics card with a standard power connector—none of this back-connect motherboard malarkey—but said connector is hidden around the back of the card. The 12V-2x6 connector plugs into the PCB right behind the heat fins, as you can see from a cut-out on the backplate. This would presumably be more cumbersome (if not impossible) if it were using a smattering of standard 8-pin connectors.

This graphics card is part of the company's Project Stealth, which is primarily a back connector project that moves headers/connectors around to the rear of the motherboard just like MSI's back-connect design.

Unlike Asus's BTF design (the graphics cards for which are now starting to become compatible with non-BTF motherboards too), Gigabyte's Stealth motherboards don't deliver GPU power through the motherboard. This new Stealth GPU should therefore make for a good pairing with the Stealth lineup because, in lieu of motherboard power delivery, it sneaks the power cable towards the pass-throughs to keep it hidden.

One benefit of this is that it won't require the purchase of an entirely new motherboard to keep those cables hidden. It's a great option for those who are looking to upgrade their GPU but already have a standard motherboard or a back-connect one that doesn't do motherboard GPU power delivery.

Gigabyte Aorus RTX 5090 Stealth Ice graphics card and box on a plain background

(Image credit: Gigabyte)

Gigabyte explains: "This new graphics card inherits the minimalist philosophy of Project Stealth by adopting a hidden power connector design, allowing seamless cable routing and a cleaner build layout. Tailored for enthusiasts who prioritize aesthetic refinement and efficient assembly, the card offers both cutting-edge performance and a streamlined look that elevates any gaming setup."

Apart from this stealthy addition, the graphics card design looks similar to the standard Gigabyte RTX 5090 Aorus Ice, which is a blocky all-white affair.

We're seeing plenty of focus on cable management and other similar build and design improvements in PC hardware land this year, which is nice to see. I was quite impressed with Lian Li's Rotation PSU that was shown at Computex last month, for instance, which allows for more control over where you route your power supply cables.

Gigabyte's Stealth GPU design is just one amongst many moves in the direction of clean-looking builds, and I'm certainly here for it.


Best CPU for gaming: Top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game first.

]]>
https://www.pcgamer.com/hardware/graphics-cards/spot-the-connector-gigabytes-new-graphics-card-design-lets-you-hide-your-power-cable-even-if-you-dont-have-a-back-connect-motherboard/ dYJs3ecoh9WkqFjwSbyufU Wed, 04 Jun 2025 12:40:03 +0000
<![CDATA[ This incredible truly wireless desktop PC build even keeps your coffee hot forever ]]>

If you thought wireless power delivery was handy for charging phones but not terribly practical for much else, prepare to have your mind blown. One of our favourite YouTube channels, DIY Perks, has just dropped a new video showcasing a new fully 3D wireless power delivery technology that looks, well, revolutionary.

Imagine a desktop that can power just about anything wirelessly. There's no need to align anything carefully with a charging pad. You don't even need to be resting on the surface: the power can be transmitted to devices positioned a foot or more above the desk. That's exactly what DIY Perks' truly wireless power desk is capable of.

The basic principles here are similar to existing wireless charging pads as typically used with smartphones. Those use a coil to generate a magnetic field that flips in polarity thousands of times a second.

If you place another coil nearby, an electric current is induced. The problem is that the secondary coil has to be positioned very close and be quite precisely aligned.

But this new 3D wireless power system can cover a very large area defined by the perimeter of a single length of wire. The 3D wireless power hardware in the video is provided by Etherdyne Technologies and a full evaluation kit as used by DIY Perks is available through their website, though the cost isn't clear.

3D power

The wireless power delivery extends well beyond the surface top. (Image credit: DIY Perks)

The kit is claimed to be both FCC and CE certified and can deliver up to 100 W of power. The evaluation unit is composed of a control box, power supply, plus the field-generating wire and has been designed to be embedded within a desktop measuring around two feet by four feet.

The wire generates a magnetic field in the same way as a charging pad but flips polarity much more rapidly, millions of times per second. The technology also relies on being precisely tuned to resonate at this frequency.
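
For a sense of what "precisely tuned to resonate" means in practice, the resonant frequency of a simple loop-and-capacitor circuit is 1/(2π√(LC)). The inductance and capacitance below are purely illustrative guesses on our part (Etherdyne doesn't publish its component values), but they show how a desk-sized loop lands in the low-megahertz range:

```python
import math

# Purely illustrative values: neither figure comes from the Etherdyne kit, they're just
# round numbers in the right ballpark for a single-turn, desk-sized wire loop and its tuning cap.
inductance_h = 5e-6      # 5 microhenries (assumed)
capacitance_f = 470e-12  # 470 picofarads (assumed)

resonant_hz = 1 / (2 * math.pi * math.sqrt(inductance_h * capacitance_f))
print(f"Resonant frequency: {resonant_hz / 1e6:.2f} MHz")  # ~3.3 MHz with these made-up values
```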

The result is a "dome" of 3D wireless power above the desk surface. Exactly how far above the desk the power can be transmitted isn't stated, but going by the DIY Perks video, the power dome extends well over a foot above the desk surface, at minimum.

For this demo, DIY Perks built a PC based on a super-slim Framework laptop mainboard into the underside of the desk surface. Up top, there's everything from a keyboard and mouse, Bluetooth speakers, a microphone and even a monitor, all powered wirelessly.

The monitor also runs via a wireless HDMI video interface, enabling a truly wireless solution with a totally cable-free screen that can be slid around the surface. DIY Perks' final flourish is a wirelessly heated coffee mug that keeps fluids at a constant 70 degrees C. Nice.

3D wireless power

There's even a perpetual mug that keeps coffee hot forever... (Image credit: DIY Perks)

At this point you may be wondering why you haven't heard of all this before and what the catch must be. Well, the first downside is power efficiency. DIY Perks doesn't go into great detail on this subject, but points out that the system isn't 100% efficient and consumes 10 W at idle.

Our understanding is that overall efficiency is probably in the 70 to 80% region at best. Then again, if you're powering it all with renewables like solar, some losses arguably aren't the end of the world. Overall power delivery is another obvious limitation. That 100 W figure, we believe, is a total for all devices being powered.
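
Taking that 70 to 80% estimate at face value (and it is only our estimate), the losses are easy enough to ballpark:

```python
# Ballpark wall draw if the full 100 W is being delivered, assuming our 70-80% efficiency guess.
delivered_w = 100
for efficiency in (0.70, 0.80):
    wall_draw_w = delivered_w / efficiency
    print(f"{efficiency:.0%} efficient: ~{wall_draw_w:.0f} W from the wall, "
          f"~{wall_draw_w - delivered_w:.0f} W lost as heat")
# 70%: ~143 W in, ~43 W lost; 80%: ~125 W in, ~25 W lost.
```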

3D wireless power

The PC itself is based on a Framework laptop mainboard built into the underside of the desktop. (Image credit: DIY Perks)

The largest power receiver coil in the kit delivers just 7 W. For a desktop solution like this, the main power draw is the PC itself, which can run on wired power under the desk. For the wirelessly powered monitor, DIY Perks says it has used a pair of receivers, so presumably that's two times 7 W to work with.

Unsurprisingly, then, it's not a large display and probably isn't very bright. A typical PC monitor probably isn't compatible with the evaluation kit as currently specced, and you can forget about a power-hungry OLED panel.

3D wireless power

The final build includes speakers, microphone and monitor, though there are power limitations that may make this kind of setup impractical in practice. (Image credit: DIY Perks)

But then this is early days for 3D wireless power and more, well, powerful solutions may be possible. As for other concerns around safety, interference with other devices, health implications and so on, well, that's outside our expertise. But as a tech demo this is all very impressive.

Heck, just having what you might call a perpetual wireless keyboard and mouse would be nice, let alone a truly wireless monitor and a cup of coffee that never, ever gets cold.


Best CPU for gaming: Top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game first.

]]>
https://www.pcgamer.com/hardware/this-incredible-truly-wireless-desktop-pc-build-even-keeps-your-coffee-hot-forever/ kvDmZES6gz4XTqiHnMijV6 Wed, 04 Jun 2025 12:23:09 +0000
<![CDATA[ Linux enjoys a small usage uptick with Steam users, though at 2.69% it still has a long way to go to topple Windows ]]> It's that time again where we look to Steam's monthly survey and all feel slightly better about our own rigs. Just for starters, I may not be able to afford the latest Nvidia card, but I can still feel a wee bit smug that neither you nor I fall into the 37% of Steam users still rocking Windows 10, right? …Right?

Operating systems that are about to become an unsupported vintage aside, the real story from the May edition of Steam's hardware and software survey is that there's been a slight uptick in folks moving to Linux, by about 0.42 percentage points.

Okay, so it's hardly like there's been a massive exodus from Windows as the end of 10 approaches, but the fact that 2.69% of Steam users are running a Linux OS is worth drilling down into.

For one thing, SteamOS itself is Linux-based. For another, SteamOS is gradually rolling out compatibility for handhelds beyond the Steam Deck, such as the Lenovo Legion Go S and other AMD devices, potentially explaining some of this increase.

Steam has even added a compatibility rating for third-party devices to its store pages, at least suggesting a long-term commitment to bringing SteamOS to a wide selection of devices.

Lenovo Legion Go S with SteamOS installed showing an update screen

(Image credit: Future)

On the hardware team, our Jacob recently got SteamOS running on his Legion Go S and hasn't looked back. However, as he notes in his feature, we've enjoyed less success getting SteamOS to run on other devices with similar internals, like the Framework 13 Strix Point laptop.

While still doable, it hasn't been all that functional in Dave's recent experience. Dave has, however, managed to get a SteamOS laptop up and running using the Framework's older 7840U mainboard.

Though 2.69% is a bit of a high point as far as Linux's share goes, I'm not going to pretend this is Linux's tipping point. Besides the fact that Windows still enjoys a whopping 95.45% total share of the OS pie in Steam's May survey, Valve's ultimate ambitions for SteamOS aren't really about replacing Windows.

Still, if you've got a third-party handheld gaming PC, you can try your own hand at installing SteamOS using Steam's own handy guide. Alternatively, you could attempt to install Windows on your Steam Deck, though our Dave would likely judge you for it.

It's worth noting that the Steam survey is a little, let's just say, all over the place. It's a good tool for a general idea of trends, but view the results with a pinch of salt. For example, I doubt there's been a sudden wave of new, dual-core CPUs coming online in the past month, but lo and behold, Steam registers a 0.12% uptick.

Linux usage around the 2% mark does at least appear to be a fairly reliable stat month-to-month: Linux usage was at 2.27% in April, slightly down on the 2.33% noted in March, but broadly above the 1.55% of users registered in February.


Best handheld gaming PC: What's the best travel buddy?
Steam Deck OLED review: Our verdict on Valve's handheld.
Best Steam Deck accessories: Get decked out.

]]>
https://www.pcgamer.com/hardware/handheld-gaming-pcs/linux-enjoys-a-small-usage-uptick-with-steam-users-though-at-2-69-percent-it-still-has-a-long-way-to-go-to-topple-windows/ AMod6QMrGX9erKh2UbqR2i Wed, 04 Jun 2025 10:49:21 +0000
<![CDATA[ TSMC boss claims the chipmaker doesn't need to pick winners to work with, just wait patiently 'because they will all come to us in the end' ]]> Yesterday, TSMC held a shareholder meeting, and in an interview following this the company showed every sign of confidence in its future, regardless of the ebbing and flowing of different AI companies that are some of its biggest customers. The world's biggest semiconductor company even went as far as to say about AI companies: "We don't need to judge who will win, because they will all come to us in the end."

That eerily ominous machine translated quote, brought to you by Taiwan Economic Daily (via Wccftech), comes from Wei Zhejia (C. C. Wei), CEO and chairman of TSMC. It comes off the back of Wei pointing out that both GPUs and ASICs are made at TSMC, implying that whatever kind of AI data centre compute you need, TSMC has you covered.

And it's true, TSMC does produce most of the compute for AI data centres, with Nvidia's various Hopper and Blackwell chips being the most obvious ones and the ones that AI companies seem to be lining up for.

Yesterday we covered how TSMC said at this shareholder meeting that "AI demand has always been very strong and it's consistently outpacing supply" despite US tariffs. The company did say that tariffs have an impact, though.

With a new $100 billion planned investment in the US and an Arizona fab capacity that's already apparently sold out through 2027, it doesn't look like TSMC has much to worry about on the international trade war front, either.

TSMC

(Image credit: Taiwanese Semiconductor Manufacturing Co.)

I suppose that with production based both in Taiwan and in the US, it's not too ridiculous to assume that "they will all come to us in the end."

Wei reportedly admits that the company did face overcapacity in the recent past, but "this time we are more careful and thorough than before." This seems to be in part because TSMC is getting forecasts and info from cloud service providers (CSPs), as "they are also worried that we are not prepared enough."

It must be difficult to predict demand in such a new and booming market as AI, but if any company can be sure to do well—other than Nvidia, of course—it does seem to be TSMC.

Let's just hope we see some of that sweet chip revenue trickle down into the laps of us humble gamers in the form of some derivative gaming chips and GPUs. Hey, I'm not above scooping up the scraps, are you?


Best CPU for gaming: Top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game first.

]]>
https://www.pcgamer.com/hardware/processors/tsmc-boss-claims-the-chipmaker-doesnt-need-to-pick-winners-to-work-with-just-wait-patiently-because-they-will-all-come-to-us-in-the-end/ vSn4gf98cP5dft9qFTGGxG Wed, 04 Jun 2025 09:58:14 +0000
<![CDATA[ 'Widespread theft': The UK government's fifth attempt to push through a bill allowing AI companies to scrape any data they like shut down by the House of Lords ]]> AI doesn't care about copyright. It can't, obviously, because it's AI, and not a human being with thoughts and feelings. But it particularly doesn't care about copyright, partly because the US and UK governments are both pushing to let AI companies scrape data for use in generated text and imagery without a care thrown to the digital wind.

The House of Lords has just denied the UK government's fourth attempt to pass a bill to let it scrape whatever data it likes, and the counter to that is merely a push for transparency over which data is scraped.

The point of this counter from the House of Lords is to allow copyright holders the means to license or protect their information from what effectively amounts to stealing. Those invested in the growth of AI are, predictably, not fans of paying artists for their work.

Sir Nick Clegg, the former president of global affairs at Meta, and previous deputy prime minister of the United Kingdom, said the House of Lords' wish would "Kill the AI industry in this country", according to the BBC.

The Data (Use and Access) Bill, as suggested by the UK government, argues that AI developers should have access to all the content they like without any consideration towards those who create the data, unless those creators specifically opt out of data collection.

Baroness Beeban Kidron, a member of the House of Lords, has argued that the "state sanctioned theft" of data would be "throwing UK designers, artists, authors, musicians, media and nascent AI companies under the bus".

LONDON, ENGLAND - APRIL 29: Baroness Beeban Kidron, speaks during a discussion of AI and copyright at The Palace of Westminster on April 29, 2025 in London, England. Parliamentarians and representatives of the Artificial Intelligence and creative industries, including Björn Ulvaeus of Abba, attended the discussion. Earlier this year, the British government held a consultation on proposed changes to UK copyright law with respect to the training of AI models. Other participants were representatives of the creative sector, the News Media Association, Make it Fair campaign, and UKAI, the trade association for AI businesses across the UK.

(Image credit: Getty Images / Carl Court)

The most recent amendment to the AI bill argues, "The regulations must require specified business data to be published by the trader or the data holder so as to provide copyright owners with information regarding the text and data used in the pre-training, training, fine-tuning and retrieval-augmented generation in the AI model, or any other data input to the AI model."

According to the BBC, the standstill between the House of Lords and the British government is "uncharted territory" with neither budging on their stance on AI and the arts. Personally, I find the push for AI companies to scrape data unless specifically told otherwise rather telling.

Many artists will not be aware of their rights around AI and will not opt out purely because they haven't been adequately informed. Artists being asked to opt in feels more equitable, but from my conversations with artists, I have a feeling very few would choose to do so. The British government is likely looking to get as much data as possible to these AI companies, artists be damned.

Similarly, it has been suggested many times that untraining data from AI models is impossible, which means that once the data is scraped, there's no going back anyway. The amendment for better transparency won through with a vote of 242 to 116.

The bill has now been pushed back to the House of Commons, where it could be further discussed this week. This means a fifth push for the bill is likely on the cards.

The argument used by the UK government is similar to that recently used by OpenAI, the owner of ChatGPT. In March, OpenAI appealed to the US government, stating that if it doesn't let OpenAI scrape copyrighted content, America would lose to China.

AI, explained

OpenAI logo displayed on a phone screen and ChatGPT website displayed on a laptop screen are seen in this illustration photo taken in Krakow, Poland on December 5, 2022.

(Image credit: Jakub Porzycki/NurPhoto via Getty Images)

What is artificial general intelligence?: We dive into the lingo of AI and what the terms actually mean.

Just months before that, OpenAI complained that DeepSeek, a Chinese AI model, stole data from it, which was data OpenAI scraped from others in the first place.

Recently, President Donald Trump introduced the One Big Beautiful Bill act (or OBBB), which attempts to mandate that states could not regulate AI for a decade, among other things, like increased tax cuts and further support for Border Patrol and ICE.

The AI section of this bill has seen criticism from even large Trump supporters, like Republican Marjorie Taylor Greene, for how wide-reaching it is.

]]>
https://www.pcgamer.com/software/ai/the-uk-house-of-lords-denies-the-governments-ai-bill-for-state-sanctioned-theft-of-copyrighted-data-for-the-fourth-time/ rrWAFVi5HtHAQ26z8b7VJC Wed, 04 Jun 2025 09:42:45 +0000
<![CDATA[ The new Arctis Nova 3 looks like SteelSeries is min-maxing the midranged headset with both hardware and software ]]> SteelSeries has announced the latest in its lineup of gaming headsets with the Arctis Nova 3 Wireless Series. It comes in two variations, the 3P for PlayStation and the X3 for Xbox, but both work with PC and Switch. The Arctis Nova 3 is a bit of a step down from some of the very impressive headsets we've reviewed, like the Arctis Nova Pro, but comes with a price to match. It looks like the Nova 3 series is all about bringing whatever SteelSeries can manage from those higher-end headsets into something more folks can afford.

But the first exciting thing we need to address on these headsets is the colours. Without even needing a collab, the Arctis Nova 3 line of headsets comes in black, white (eh), aqua (oooh), and lavender (aaaaah). They haven't skimped on the colour either, with each headset featuring an allover paintjob, including the recessed SteelSeries logo on the side and even the detachable microphone.

Aside from the colour offerings, these headsets sport custom neodymium magnetic drivers, with up to 40 hours of battery life. They work with both Bluetooth and a USB-C dongle for high-speed 2.4 GHz wireless, and you can quickly swap between connected devices. Plus there's optimised fast charging that boasts 90 minutes of play time after only a 15 minute charge. All packed into what looks like a fairly comfortable set that weighs only 260 g. This actually seems like a great deal in a headset that's only a little over $100 USD.

But according to SteelSeries, it's not just the hardware but the software that makes this headset worth buying. It pairs with the new Arctis App, which is free on both the Google Play and Apple App stores. The app offers the usual array of sound controls and customisation and a little more besides, which is always helpful in a gaming headset, letting you tweak settings from your phone rather than having to pause the game. It also has over 200 presets for various games and playstyles.

It seems like overkill, but in a headset built down to a price, being able to dial in the sound this way means you can push the available hardware much further. It's still almost certainly overkill, but I'd be keen to see how close they can get this new Arctis to something like the Arctis 7 or 5X.

These headsets should be up on the SteelSeries website now, retailing for $199 AUD, $119.99 USD, or €109.99. If these specs and hardware can push this midranged headset into the upper levels then that's a huge win for SteelSeries, and for the consumer. Just think, you can use the money you've saved on your headset for more important things, like the same headset in the other cool colour.


Best gaming mouse: the top rodents for gaming
Best gaming keyboard: your PC's best friend...
Best gaming headset: don't ignore in-game audio

]]>
https://www.pcgamer.com/hardware/gaming-keyboards/the-new-arctis-nova-3-looks-like-steelseries-is-min-maxing-the-midranged-headset-with-both-hardware-and-software/ VTLKzJZ3pqoVxfRkfxMz3n Wed, 04 Jun 2025 04:32:56 +0000
<![CDATA[ Remedy announces very friendly minimum specs for its co-op Control spinoff, FBC: Firebreak. You only need a GTX 1070 for 1080 60 fps play ]]> I've been hanging out for Remedy's next game after falling head over heels in love with the developer's last big hit, Control. So when FBC: Firebreak was announced by the brand last year, I was delighted to see the team is bringing a cooperative shooter into this twisted world. Control resonated with me so intensely that it's in my top ten games of all time. This is in no small part thanks to how deeply weird the game can be, and FBC: Firebreak is looking to ramp this right up for some silly fire-zombie shenanigans.

With any new game release, there's always the question of hardware. And given how much of a ray tracing demon Control was for its time, I was a little worried about how demanding FBC might end up. There's not much reason to get excited for a game that my machine can't even run. Well, thankfully, Remedy recently announced the required specs over on BlueSky, and this is looking like a game that will still run, even on a fairly dated rig.

So not only is FBC: Firebreak looking to respect my time, it might respect my wallet as well.

To run at 1080p 60 fps with quality upscaling, you only need an Nvidia GTX 1070 or an AMD RX 5600 XT. This needs to be paired with at least an Intel i5-700k or an AMD Ryzen 5 1600X, as well as 16 GB of RAM. You'll also need 30 GB of SSD storage to store and run the game from. Really, that's some super minimal stuff. While it'll only run the game on low settings, I'm impressed it would do it at all, so props to Remedy for giving those of us still rocking older hardware a chance.

Naturally, the recommended specs are a fair bit more demanding than the minimum. These will let you run the game on medium settings at 1440p 60 fps with quality-based upscaling. For this, you'll be needing a much more recent GPU, with at least an Nvidia RTX 3060 or an AMD RX 6600 XT, giving you at least an extra 2 GB of VRAM over those minimum specs. It's recommended to pair these with at least an Intel i5-8500 or AMD's Ryzen 5 2600 CPU, and still have at least 16 GB of RAM.

FBC: Firebreak specs

(Image credit: Remedy Entertainment)

To step that upscaling up to 4K performance levels, Remedy recommends upgrading that GPU. An RTX 3070 or RX 6800 XT can level this gaming experience up to the High preset. If nothing else, this tier demonstrates just how far a GPU upgrade can go toward helping you enjoy this game built for chaotic joy.

If you want the high-definition ray tracing experience of your lives, then Remedy's High Ray Tracing tier is probably what you're after. These settings deliver a 4K 60 fps experience with performance upscaling like before, as well as High ray tracing enabled. Hold your breath, mind, as Remedy recommends you're packing an RTX 4080 or RX 9070 XT before attempting these settings. Your CPU also needs an upgrade, with an i7-8700K or Ryzen 5 3600 specified. The good news is that your 16 GB of RAM should still be enough, so you can save some cash there at least.

And look, I'm not going to lie. Control looks amazing on rigs that can do it justice, and I'm betting FBC: Firebreak will too. All those fire-infected Hiss-like creatures will no doubt look incredible reflecting off the shiny walls of the Oldest House's offices, but they don't have to. I'm excited to be able to play this game with more friends than I initially expected, even if their weird pink blobs of goo don't look as good as mine.


Best CPU for gaming: Top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game first.

]]>
https://www.pcgamer.com/hardware/remedy-announces-very-friendly-minimum-specs-for-its-control-spinoff-fbc-firebreak-you-only-need-a-gtx-1070-for-1080-60-fps-play/ z2EBraEnejV92X2QFudjoX Wed, 04 Jun 2025 02:52:56 +0000
<![CDATA[ The Witcher 4's leafy glory is all down to Epic's Nanite Foliage, and largely so is the fact it can run ray-traced at 60 fps on a lowly PS5 ]]>

As the opener to an annual tech event, it's pretty hard to beat showing off what The Witcher 4 will look and run like in actual gameplay, but that's exactly what Epic and CD Projekt Red did for the former's keynote speech, State of Unreal 2025.

We all knew it was coming, of course, but sitting in a crowd of developers, creators, and journalists in Orlando, eardrums being blasted away with surround sound Ciri, it was clear that everyone wanted to be absolutely wowed at what The Witcher 4's developers had in store for us.

The thing is, we kind of knew that too, because rather than use its own RED Engine, CDPR had already announced its switch to Unreal Engine 5 for The Witcher 4.

So I didn't think there would be anything that new to really show off. And, to a certain degree, that was very much the case, but there was one standout technological improvement, one that Epic will eventually bring out with Unreal Engine 5.7 and that looks to be what makes The Witcher 4's environment so dense, lush, and detailed: Nanite Foliage. The first iteration of Epic's Nanite tech (a virtualised geometry engine) was about improving the surfaces of rocks, walls, and other large objects; now it's all about leaves. Lots and lots of leaves.

What Epic and CDPR demonstrated with The Witcher 4 suggests that Nanite Foliage is going to be a very widely used system. That's because it uses voxels, instead of rasterized triangles, to handle leaves, needles, branches and the like when the camera moves away, to the point where you can't tell that what was a highly detailed tree is now just a haze of blobs.

The voxels still get lit as part of Lumen and all generate real-time shadows, and the overall look is very convincing. At one point in the CDPR tech demo, the camera swoops down and rushes through a dense forest: at no point do you see any stuttering or lag or obvious LOD changes, just a seamless rush through countless trees.

Naturally, because it's all done using Nanite, the overall landscape, ground, lakes, rivers, and so on are just as fast and smooth. Letting the side down somewhat is the shadowing, as there are clear moments in the demo where shadows cast by trees and other objects pop into view as the camera hurtles about. That might all be improved by the time The Witcher 4 is released, of course, but it might also just be a limitation of the fact that a standard PS5 was being used to run the demo.

Just a standard PlayStation 5, not a Pro, running the tech demo at a solid 60 fps with ray tracing. Given that the "main goal" for UE 5.6 is performance, not necessarily pretties, that's pretty pleasing.

Nanite Foliage in The Witcher 4 tech demo

(Image credit: Epic)

With the next chapter in the Witcher games aiming to be ray-traced and running at 60 fps on that particular platform, it bodes well for what the PC version might be capable of. That said, we thought the same about Cyberpunk 2077, and look how that turned out.

In fairness, the CP2077 one can play now is vastly superior to the game's original launch state, but perhaps with CDPR choosing UE5 for The Witcher 4, the developers will be able to focus far more on content, assets, gameplay, and optimisation, rather than just having to get the damned thing to work in the first place.

Anyway, the breakdown of what Epic's Nanite Foliage system will do garnered the biggest audience reaction of the whole Witcher tech demo, which could possibly mean that we'll see more open-world games in the future sporting truly epic scales of vegetation and all things nature-wise. Or should that be Epic scales?

Best SSD for gaming: The best speedy storage today.
Best NVMe SSD: Compact M.2 drives.
Best external hard drive: Huge capacities for less.
Best external SSD: Plug-in storage upgrades.

]]>
https://www.pcgamer.com/hardware/the-witcher-4s-leafy-glory-is-all-down-to-epics-nanite-foliage-and-largely-so-is-the-fact-it-can-run-ray-traced-at-60-fps-on-a-lowly-ps5/ QBxStQe5EK3Na3wJvsYERU Tue, 03 Jun 2025 16:38:07 +0000
<![CDATA[ Unreal's MetaHuman Animator can generate surprisingly lifelike animation with just your phone and it's now integrated directly into Unreal Engine 5.6 ]]> The 2025 State of Unreal gave us some long-awaited updates on the tech powering The Witcher 4, and while I was picking my jaw up from the floor, something new found its way onto my screen—a bizarre animated alien rendered from a webcam feed in real time.

Epic's new updates to MetaHuman, its suite of lifelike digital human character tools, are astoundingly cool, and feel like they may help lower the barrier to entry for budding developers.

Effectively, the entire suite is intended for use by developers to make character models with high levels of detail more easily, which can then be exported for use in projects.

With the launch of Unreal Engine 5.6, it has been integrated directly into the engine to optimize workflow. There are a few main elements of MetaHuman getting updates.

MetaHuman Creator is a cloud-enhanced bit of software for building hyper-realistic models. This is now getting a body component, for further customization, but Epic Games has also announced the ability to get its source code, which should give users more freedom to customize it.

The new body system also comes with the ability to add clothes, which will dynamically wrap around models. This effectively means you don't have to create new clothes for new body types, and clothes packages can be bought and sold on a separate marketplace.

These are all interesting additions to a very impressive tool, but it's the MetaHuman Animator that really wowed me (that's the one with the alien I was talking about). Effectively, it can take footage from a webcam or phone, and plant it onto a MetaHuman, and it's all done in real-time.

Actors don't even have to wear the custom suits traditionally donned for motion capture, and it can be done with mono capture devices (a video source without depth, like what you might find in a head-mounted camera).

Above: Unreal Engine 5 MetaHuman Animate Tool shown off in real-time. Credit: Epic Games.

The presentation of this tool has a 'reality cam' so that it can show the difference between real capture and that of the animation. There's almost no lag in between the two clips, and facial expressions are captured with ease. The actor moves their mouth to their left, and the alien's mouth texture stretches in a similar fashion.

Perhaps most impressively, MetaHuman Animator can be used with an audio source and character model alone, and can "take into account the emotion of the speaker from the audio to provide more lifelike animation".

You can adjust the emotionality of animation, and it will even add head movement to your character to seem more realistic.

An actor using the Unreal Engine MetaHuman Animate tool

(Image credit: Epic Games)

This tech feels like it could be a major boon to developers, and the fact that it's both built into Unreal Engine 5.6 and creators are encouraged to use the Fab asset store feels like an excellent way for smaller teams to get their hands on expressive and real-feeling digital characters.

Digital characters made with MetaHuman can also be used in other engines, too.

I can't believe I'm saying this as someone who isn't a developer, but this has made me genuinely excited for a tool in a game engine.

Only time will tell how well this works for development teams, but the full launch should encourage them to go hands-on with it.


Best gaming monitor: Pixel-perfect panels.
Best high refresh rate monitor: Screaming quick.
Best 4K monitor for gaming: High-res only.
Best 4K TV for gaming: Big-screen 4K PC gaming.

]]>
https://www.pcgamer.com/hardware/unreals-metahuman-animator-can-generate-surprisingly-lifelike-animation-with-just-your-phone-and-its-now-integrated-directly-into-unreal-engine-5-6/ bLmTjSzqLvuzEphyWxT9YZ Tue, 03 Jun 2025 15:54:09 +0000
<![CDATA[ The 'main goal' for Epic Games' new Unreal 5.6 engine is more performance on the PS5 and that should be good news for gaming on affordable PC hardware ]]> Epic Games has been showing off its latest Unreal 5.6 game engine. While it's full of technical innovations and new graphical and physics features, the main takeaway is that the latest version of arguably the most important video game engine is all about performance.

Indeed, Epic's Senior Director of Framework Engineering Julien Marchand says the "main goal" for Unreal 5.6 is performance. After showcasing a new open world Witcher 4 demo from CD Projekt Red, which we'll be covering in detail later today, Marchand said, "an open world like that was just the perfect content to focus on our main goal for the release, which was performance. We want the engine to scale to large and rich worlds, we want to preserve graphics quality, and of course we want to deliver high frame rates that you and your players expect."

Epic showed the Witcher 4 demo running on plain old Sony PS5 hardware, not even the new PS5 Pro, at a claimed 60 fps and with ray tracing enabled. The render resolution wasn't mentioned, but the visuals, including dense foliage, animal hair and muscle animation, cloth physics and more looked pretty stunning.

Of course, the PS5's APU is pretty modest by modern gaming PC standards. So, anything that runs well on PS5 ought to fly on even pretty modest PC hardware.

Anyway, to get Unreal running faster, a number of optimizations and new features have been added. For starters, there's the new Nanite Foliage system, an adaptive voxel representation for creating foliage, trees and the like.

It works by replacing distant triangles with cubes that are smaller than individual pixels. According to Epic the new system is "super fast" to render and "allows artists to render whatever amount of foliage is needed to achieve their vision without compromise."
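
To be clear about how we understand the idea (this is a conceptual sketch, not Epic's actual Nanite code), the switch boils down to a screen-space size test: keep triangle geometry while a leaf would still cover at least a pixel, and swap in a voxel proxy once it wouldn't. The leaf size, field of view, and resolution below are assumptions picked purely for illustration:

```python
import math

# Conceptual sketch only, not Epic's Nanite Foliage code: pick triangles or a voxel
# proxy based on how many pixels a single leaf would cover on screen.

def projected_size_px(feature_size_m, distance_m, vfov_deg=70.0, screen_height_px=2160):
    """Approximate on-screen height, in pixels, of a feature of a given world-space size."""
    angular_size = 2 * math.atan(feature_size_m / (2 * distance_m))
    return angular_size / math.radians(vfov_deg) * screen_height_px

def pick_representation(leaf_size_m, distance_m):
    # If an individual leaf would cover less than a pixel, a sub-pixel voxel can
    # stand in for the full triangle geometry without a visible change.
    return "triangles" if projected_size_px(leaf_size_m, distance_m) >= 1.0 else "voxels"

for distance in (5, 50, 200):
    print(f"{distance:>3} m away: {pick_representation(leaf_size_m=0.05, distance_m=distance)}")
# 5 m and 50 m: triangles; 200 m: voxels (for a 5 cm leaf at 4K with a 70-degree vertical FOV)
```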

"The only way to render a forest that looks as good as this one is to use 3D geometry for all parts of the tree," Marchand said. The clever bit is that the entire forest is composed of just 28 modular tree parts, each instanced thousands of times and used as building blocks. The foliage system also supports skeletal animation that runs on the GPU.

The only catch here is that the new foliage system seen in the Witcher 4 demo isn't due to be available to the wider game dev community until the next Unreal 5.7 update. Pity.

Anyway, next up is a new animation framework. The idea here is to support loads of detailed, animated game characters. Epic showed a large crowd of "over 300 animated skeletal mesh agents, all going about their business" in the Witcher 4 demo while maintaining 60 fps and with performance "room to spare" in the main game thread.

Some universal performance optimizations have also been applied to Unreal's lighting engine. "Ray tracing and Lumen now run more than twice as fast, with no visual trade off compared to when Unreal Engine 5 was released," Marchand said, "and because of that we can now hit 60 fps on consoles."

Unreal 5.6

Unreal 5.6 includes "procedural muscle animations" to make deformation and motion rendering more natural. (Image credit: CD Projekt Red)

Unreal's "Chaos" physics engine has also been given an overhaul. Improvements to Chaos Cloth mean that multiple fabrics and garments now interact faster and more accurately, while with "Chaos Flesh", Epic has introduced, "procedural muscle animations to make deformation and motion more natural."

Another neat detail is pre-baked fluid dynamics. "We now support baking of fluid surface data into light-weight assets to be played back at low cost. This is going to give you high fidelity water without the performance cost of real-time simulation," Marchand explained.

There's a whole bunch of other stuff, all aimed at getting a better visual and gameplay experience out of ye olde Playstation 5 hardware. And that's got to be a good thing for the PC.

Unreal 5.6

Pre-baked fluid dynamics will allow realistic water without the frame-rate hit. (Image credit: CD Projekt Red)

The PS5 makes do with eight pretty ancient AMD Zen 2-spec CPU cores and 36 AMD RDNA 2 graphics CUs. The graphics CU count is slightly more than the 32 CUs of the AMD Radeon RX 6600 XT, which was a fairly low end PC GPU back at launch in 2021.

On paper, the new AMD Radeon RX 9060 XT has the same 32 CU count. But it's two generations newer and so has around 2.5 times the raw computational power of the 6600 XT for general rendering tasks, plus massively upgraded ray tracing hardware.
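
That "around 2.5 times" figure comes from paper-spec FP32 throughput rather than measured gaming performance: shader count times two FMA ops per clock times boost clock, doubled again on RDNA 4 for its dual-issue FP32 units. A rough sketch:

```python
# Paper-spec FP32 throughput; real-world gaming gains will be smaller than this suggests.
def peak_fp32_tflops(shaders, boost_ghz, dual_issue=False):
    ops_per_clock = 2 * (2 if dual_issue else 1)  # FMA = 2 ops; RDNA 3/4 can dual-issue FP32
    return shaders * ops_per_clock * boost_ghz / 1000

rx_6600_xt = peak_fp32_tflops(2048, 2.589)                  # RDNA 2, 32 CUs
rx_9060_xt = peak_fp32_tflops(2048, 3.13, dual_issue=True)  # RDNA 4, 32 CUs
print(f"RX 6600 XT: {rx_6600_xt:.1f} TFLOPS | RX 9060 XT: {rx_9060_xt:.1f} TFLOPS | "
      f"{rx_9060_xt / rx_6600_xt:.1f}x on paper")
# ~10.6 vs ~25.6 TFLOPS, around 2.4x
```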

In short, the likes of the RX 9060 XT should make mincemeat of anything that runs well on PS5 hardware. Of course, such on-paper assumptions don't always play out in practice, what with poorly optimized console ports. But it's certainly hard to watch the new Unreal 5.6 demo showcasing Witcher 4 and not come away both impressed and hopeful for what might be possible on sensible PC hardware very soon.


Best CPU for gaming: Top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game first.

]]>
https://www.pcgamer.com/hardware/the-main-goal-for-epic-games-new-unreal-5-6-engine-is-more-performance-on-the-ps5-and-that-should-be-good-news-for-gaming-on-affordable-pc-hardware/ PC3FB8j7Yfvr5KtYsc5XL Tue, 03 Jun 2025 15:53:14 +0000
<![CDATA[ Microsoft agrees that USB is a mess and it's making changes to fix it: 'Manufacturers can implement ports that look identical but differ wildly in functionality' ]]> Have you ever gone to plug in your USB Type-C cable to extend your display, ready for the glory of a second monitor, only for nothing to happen? Well, there's a good chance it's because your USB port isn't quite capable of doing so. Microsoft is implementing a new system to try and stop this problem.

As announced in the Microsoft USB Blog (which I've just discovered is a thing thanks to Tom's Hardware), writer Ugan Sivagnanenthirarajah talked a little about the rationale behind this decision and how Microsoft plans on implementing it.

Microsoft has introduced new goals that suppliers looking to earn Microsoft's stamp of approval must meet before being able to ship Windows on their machines. Microsoft wants partners to make sure USB data, charging, and display capability "just works" on all of a machine's USB Type-C ports. It also wants partners to ensure that 40 Gbps USB Type-C ports "give full compatibility with USB4 and Thunderbolt 3 peripherals."

In order for hardware to launch with Windows 11, version 24H2, companies must achieve both goals, as well as follow all other rules of the Windows Hardware Compatibility Program (WHCP).

To ensure partners meet the standard of making sure USB ports just work, all WHCP-compliant hardware with USB Type-C must have USB-IF-certified silicon in their PC, and all devices plugged in must "charge efficiently and consistently as every USB-C port on a certified PC needs to support USB Power Delivery charging." All Type-C ports must support DisplayPort Alt-Mode for connecting extra monitors and have to be "validated using Microsoft's built-in USB controlling drivers."

A screenshot from the Microsoft blog going over the difference between USB 4 and USB4, according to WHCP requirements

(Image credit: Microsoft)

For a port to be certified as USB4, it must be capable of a 40 Gbps max data speed or more, must offer up to 15 W of power to accessories (with that being 7.5 W for tablets), has to support PC charging, must support a minimum of dual 4K at 60 Hz display, and has to have both PCIe support and Thunderbolt 3 compatibility.
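
Condensed into code form, that USB4 checklist looks roughly like the sketch below. It's just a restatement of the criteria listed above with made-up field names, not Microsoft's actual certification test logic:

```python
from dataclasses import dataclass

# Illustrative restatement of the USB4 requirements listed above; field names are invented,
# and the real checks are run through Microsoft's own certification test suite.

@dataclass
class TypeCPort:
    data_gbps: int
    accessory_power_w: float
    supports_pc_charging: bool
    dual_4k60_display: bool
    pcie_support: bool
    thunderbolt3_compatible: bool

def meets_usb4_bar(port: TypeCPort, is_tablet: bool = False) -> bool:
    min_accessory_w = 7.5 if is_tablet else 15.0  # tablets only need 7.5 W to accessories
    return (port.data_gbps >= 40
            and port.accessory_power_w >= min_accessory_w
            and port.supports_pc_charging
            and port.dual_4k60_display
            and port.pcie_support
            and port.thunderbolt3_compatible)

laptop_port = TypeCPort(data_gbps=40, accessory_power_w=15, supports_pc_charging=True,
                        dual_4k60_display=True, pcie_support=True, thunderbolt3_compatible=True)
print(meets_usb4_bar(laptop_port))  # True
```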

The testing process is implemented via Microsoft's HLK (Hardware Lab Kit), and "OEMs, silicon vendors, and accessory brands run these HLK tests and submit logs to Microsoft. Any failure halts certification until the issue is resolved in hardware or firmware."

And if you're wondering why Microsoft is going through all this trouble—have you used a USB port on a modern PC lately? It's all kinds of bad, and the naming for each USB generation is just as bad. I just hope Microsoft's efforts are actually effective enough to do something about USB confusion.

Microsoft points out that "manufacturers can implement ports that look identical but differ wildly in functionality," so customers can be misled by the ports. This new process to implement standardized Type-C functionality is intended to give some clarity on that, and give assurance when you buy a WHCP-certified machine.

This new process is a good thing for Microsoft to implement, and Microsoft is likely looking for a bit of a PR win right now. It laid off just under 7,000 employees last month, and Microsoft's Build conference was interrupted by protests over its ties to the Israeli military, with further pressure from the BDS (Boycott, Divestment, and Sanctions) movement.


Windows 11 review: What we think of the latest OS.
How to install Windows 11: Guide to a secure install.
Windows 11 TPM requirement: Strict OS security.

]]>
https://www.pcgamer.com/hardware/microsoft-agrees-that-usb-is-a-mess-and-its-making-changes-to-fix-it-manufacturers-can-implement-ports-that-look-identical-but-differ-wildly-in-functionality/ GYtrCH39Ruf5F7jmpK7jub Tue, 03 Jun 2025 13:42:03 +0000
<![CDATA[ The world's biggest chip maker TSMC says it still can't keep up with demand for AI hardware despite tariff uncertainty ]]> Uncertainty thanks to wildly fluctuating tariffs might be making it hard to predict how much your next graphics card is going to cost. But apparently it hasn't stopped Taiwanese chip fab TSMC from cranking out as many GPUs as humanly possible.

Speaking at TSMC's annual shareholder's meeting today (via Reuters), CEO C.C. Wei said tariffs had yet to significantly impact the company. "I can assure you that AI demand has always been very strong and it's consistently outpacing supply," Wei said.

He did concede that TSMC isn't entirely immune to tariffs. "Tariffs do have some impact on TSMC, but not directly. That's because tariffs are imposed on importers, not exporters. TSMC is an exporter. However, tariffs can lead to slightly higher prices, and when prices go up, demand may go down," he said.

But as things stand, the company is still running at maximum production volume. One aspect of TSMC's operations that perhaps isn't going at full tilt, however, is its investment in new production facilities, or fabs, in the US.

Reuters says Wei and President Trump have discussed difficulties in completing TSMC's full investment program, which totals $165 billion, within five years. Apparently, part of the problem is that tariffs increase the cost of production in the US because some equipment required to build out the fabs has to be imported from Asia. Wei says that President Trump told him, "Mr Wei, do your best, that's good enough."

Anywho, if you've been wondering whether there's any chance that tariffs might somehow conspire to make GPUs for gaming a little easier to come by, perhaps courtesy of reducing demand for AI chips, that doesn't seem to be the case.

That said, outside of the US, many if not most of Nvidia's GPUs can now be had at MSRP or even lower. So, it does look like graphics card pricing is finally trending towards something approaching normality.

It is, of course, very hard to say what will happen next. The US courts seem to be moving closer to putting guard rails around the more volatile elements of Trump's tariff fluctuations.

But until that emerging dispute has been escalated all the way to the Supreme Court, it's impossible to say how it will all play out. Put it this way, you'd be very brave to say it's game over for Trump's tariff escapades.


Best CPU for gaming: Top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game first.

]]>
https://www.pcgamer.com/hardware/processors/the-worlds-biggest-chip-maker-tsmc-says-it-still-cant-keep-up-with-demand-for-ai-hardware-despite-tariff-uncertainty/ cqfdo8ZPjZBCNztNde3X3n Tue, 03 Jun 2025 11:03:18 +0000
<![CDATA[ Hell is Us system requirements require an RTX 4090 and upscaling for 4K at 30 fps, which sounds pretty hellish for my gaming PC ]]> If you have the might of an RTX 4090 powering your rig, the Hell is Us system requirements suggest you will be able to run the game at Ultra settings at 4K, aiming for a 30 fps average. Curiously, the system requirements all assume upscaling. That's, well, not great, but there is a little nuance as to why Hell is Us' publisher has announced those numbers and what they mean.

Ahead of its release on September 4, the Hell is Us team took to X to announce the game's system requirements. Notably, it's a wider spread of settings than many system requirements offer, ranging from 1080p 30 fps on Medium settings all the way up to 4K 30 fps on Ultra, across five separate recommendation tiers.

To run the game on minimum, you will need at least an Intel Core i7 7700K / AMD Ryzen 3 3300X with either an Nvidia GTX 1070 or AMD RX 5600 XT. Alongside this, you will need 30 GB of free space on an SSD and 16 GB of RAM. Considering it runs on Unreal Engine 5, these specs don't seem too bad.

Recommended specs suggest an Intel Core i7 11700K / AMD Ryzen 5 7600 and either an Nvidia RTX 2080 Ti or an AMD RX 6750 XT. This will get you a performance of 1080p on High settings, with an average fps of 60.

It's the Ultra settings that seem the most out of place here. The CPU doesn't change from recommended, and neither does the memory, but you are expected to have an Nvidia RTX 4090 or AMD RX 7900 XTX to run the game on Ultra settings at 4K, averaging 30 fps.

Hell is Us system requirements

Minimum: Windows 10 (64-bit) | Intel Core i7 7700K / AMD Ryzen 3 3300X | Nvidia GTX 1070 / AMD RX 5600 XT | 16 GB RAM | 30 GB SSD | 1080p 30 fps (Medium)

Recommended: Windows 10 (64-bit) | Intel Core i7 11700K / AMD Ryzen 5 7600 | Nvidia RTX 2080 Ti / AMD RX 6750 XT | 16 GB RAM | 30 GB SSD | 1080p 60 fps (High)

High: Windows 10 (64-bit) | Intel Core i7 11700K / AMD Ryzen 5 7600 | Nvidia RTX 3080 / AMD RX 6800 XT | 16 GB RAM | 30 GB SSD | 1440p 60 fps (High)

Very high: Windows 10 (64-bit) | Intel Core i7 11700K / AMD Ryzen 5 7600 | Nvidia RTX 3090 / AMD RX 6900 XT | 16 GB RAM | 30 GB SSD | 1440p 60 fps (Very high)

Ultra: Windows 10 (64-bit) | Intel Core i7 11700K / AMD Ryzen 5 7600 | Nvidia RTX 4090 / AMD RX 7900 XTX | 16 GB RAM | 30 GB SSD | 4K 30 fps (Ultra)

Citing 'passionate discussion', a Steam news post elaborated on the Hell is Us system requirements, with the team sharing what kind of upscaling those figures assume. The game uses TSR (Unreal Engine's own Temporal Super Resolution upscaling): Low settings use 50% upscaling, while Ultra uses 10%. The game also supports DLSS, XeSS, and FSR, but the upscaling referenced in these requirements is Unreal's own.
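
Assuming those figures are standard Unreal screen-percentage values, which scale each axis (our assumption, not something the developer spells out), the internal render resolution works out like this:

```python
# What a TSR screen percentage means for internal render resolution, assuming the quoted
# figures are standard per-axis Unreal screen-percentage values (our assumption).
def internal_resolution(output_w, output_h, screen_percentage):
    scale = screen_percentage / 100
    return round(output_w * scale), round(output_h * scale)

print(internal_resolution(1920, 1080, 50))  # (960, 540): a quarter of the output pixels
print(internal_resolution(3840, 2160, 50))  # (1920, 1080)
```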

Hell is Us is built on Unreal Engine 5, which is an engine infamous for being a bit tough on hardware, thanks in part to fancy reflection and lighting engines. The game is being developed by Rogue Factor, which previously released games like Mordheim: City of the Damned and Necromunda: Underhive Wars. Hell is Us, unlike those previous two games, is an entirely original work, and it made quite an impression after years of silence at Sony's State of Play in September last year.

The team has clarified that the system requirements are "ideal specs" and it "doesn’t mean you can’t run the game on lower settings". The team continues: "We’d rather be cautious than risk disappointing players by overpromising."

If you're worried your rig can't run it or don't quite trust those figures, you can download a demo for the game via the Hell is Us Steam page right now and test it for yourself.


Best gaming PC: The top pre-built machines.
Best gaming laptop: Great devices for mobile gaming.

]]>
https://www.pcgamer.com/hardware/hell-is-us-system-requirements-require-an-rtx-4090-and-upscaling-for-4k-at-30-fps-which-sounds-pretty-hellish-for-my-gaming-pc/ Zwyf8XphsdU5NtBtz6NU25 Tue, 03 Jun 2025 10:22:49 +0000
<![CDATA[ Nvidia's new Arm-based APU rumoured to launch in an Alienware laptop later this year with RTX 4070 mobile performance and 'breakthrough' power efficiency ]]> Nvidia's long-rumoured APU for gaming laptops didn't emerge at the recent Computex show. But now there's talk that it could appear by the end of the year powering a new Alienware laptop and offer gaming performance on par with an RTX 4070 mobile GPU while consuming barely more than half the power.

According to a report in the United Daily News, a Taiwanese outlet, the new APU will launch either in the final quarter of 2025 or early in 2026. As previously rumoured, it's said Nvidia is producing the new chip in co-operation with MediaTek, the latter being a specialist in designing chips with Arm cores.

Intriguingly, the UDN story claims the new chip sports, "a customized Arm architecture CPU." One of the big unknowns with Nvidia's upcoming APU is the question of whether it uses off-the-shelf CPU cores designed by Arm or whether Nvidia has designed its own cores that are compatible with the Arm instruction set.

By way of example, Apple has taken the latter approach with its M series chips and, as a consequence, produces what are widely agreed to be the most efficient CPU cores currently available. Arm's in-house CPU designs are decent enough, but Nvidia-designed CPU cores would certainly be more exciting. The UDN story implies the cores will indeed be Nvidia designed, but that is yet to be proven.

Up top we mentioned how this new Nvidia chip is going into an Alienware gaming laptop. That raises two immediate questions. First, what kind of graphics hardware will it have? Second, how will it cope with existing games designed for x86 CPUs from Intel and AMD rather than Arm cores?

UDN answers the first query in part, saying that the chip will offer an integrated GPU based on Nvidia's latest Blackwell architecture as used by various RTX 50 generation GPUs, such as the RTX 5070. The publication goes on to say that the chip offers, "the same level of performance as a 120 W RTX 4070 notebook," but does so at just 65 W and will therefore represent a "breakthrough" in power efficiency that will enable smaller and lighter gaming laptops.

If that sounds exciting, the perennial problem of software compatibility remains. The chip will presumably run Windows on Arm and therefore rely on Microsoft's Prism translation layer to support legacy PC games designed for x86 CPUs.

Thus far Prism has been a bit hit and miss when running on Qualcomm's Snapdragon X Arm chips for PCs. You would expect Nvidia to offer superior drivers for its Arm chip, but doubts remain over the basic approach of x86 emulation for running games.

Of course, if any company can encourage game developers to release native Arm versions of their titles, sidestep the emulation problem entirely, and make PC gaming on an Arm-based CPU viable, it's probably Nvidia. But there's still much to be proven.

Further details like the cost of the new device are also unknown. But the basic proposition of a more portable gaming laptop with much improved battery life is undeniably exciting. Nvidia's CEO has previously confirmed that an Arm-based APU for PCs is definitely coming. So, we can't wait to see what team green has come up with.


Best CPU for gaming: Top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game first.

]]>
https://www.pcgamer.com/hardware/processors/nvidias-new-arm-based-apu-rumoured-to-launch-in-an-alienware-laptop-later-this-year-with-rtx-4070-mobile-performance-and-breakthrough-power-efficiency/ QWgUnL7SHFKUqAdi2VumPC Tue, 03 Jun 2025 09:59:00 +0000
<![CDATA[ Over $90,000 worth of RTX 5090 GPUs have been replaced in-box by crossbody backpacks at just one Silicon Valley Micro Center ]]> What started out as just another gamer finding their brand new graphics card missing and replaced in the box by some random piece of junk has turned into a potentially big headache for Zotac and a compromised Chinese factory shipping not-RTX 5090s out the front door.

Late last week one redditor posted about their poor luck in having been sold a brand new Zotac RTX 5090 Solid OC card from a new Micro Center store in Santa Clara, only to get it home some four hours later to discover that the box contained, in their words, "3 crossbody backpacks inside rather than my 5090."


The inevitable regret at not having opened the box in store, or at least captured the unboxing on video, was washed away the following day when they returned to the store. Staff there had already seen the Reddit post and had found 31 other boxed-up backpacks within Zotac RTX 5090 packaging.

The second update post details the follow up trip and thanks the staff at the Santa Clara Micro Center for their efforts and the speed with which they replaced the OP's graphics card.

Videocardz followed up with Micro Center and was told that, while the store had tracked the cards back to its supplier, it now believed the Zotac cards in question were "compromised at the Zotac China factory before they even shipped to the US."

That immediately makes this a far more worrisome story than just a single user getting shipped a brick inside a GPU box. The fact that a single store has discovered 32 missing RTX 5090s doesn't suggest an isolated incident confined to one delivery for one shop.

I've been to that Zotac China factory in Shenzhen and it's a big ol' facility. I wouldn't be surprised to hear of more missing Zotac GPUs in the next month—surely more than just 32 cards would have been replaced in a shipment bound for the US if it really was an issue at source.

That would mean it's potentially far more than just $92,800 worth of Nvidia graphics cards missing from retail boxes, and I can almost hear the sounds of box cutters ripping into Zotac GPU packaging across retailers all over the country searching through their stock rooms.

I've reached out to Zotac for comment and will update on its assessment of the scale of the problem if we hear back.

And if you're picking up a Zotac card in the near future I'd recommend either opening it in-store or taking some video of you opening up the box just to be safe. Not all retail stores are going to be as accommodating as the Micro Center staff seemed to have been this time.

]]>
https://www.pcgamer.com/hardware/graphics-cards/over-usd90-000-worth-of-rtx-5090-gpus-have-been-replaced-in-box-by-crossbody-backpacks-at-just-one-silicon-valley-micro-center/ Km7Xv4DMTqdHQyG6Hgs5X4 Mon, 02 Jun 2025 16:10:52 +0000
<![CDATA[ Asus ROG Pelta review ]]> My first thought when I saw the Asus ROG Pelta was, 'Really, Asus went for that name?' The second thought was 'boy, these really feel like the Steelseries Arctis Nova 7'. That second thought certainly isn't a bad one.

The Pelta has a notably soft and comfortable texture—one that doesn't stand out in the looks department but is much appreciated after a long day's work. Coming into the summer months, its light, breathable material is a welcome sidestep from the Corsair Virtuoso Max I was previously using day to day.

As it is a gaming headset, the ROG Pelta inevitably comes with RGB lighting on the side—with all of its battery sucking trappings—but is otherwise relatively understated. The mic can even be unplugged, which, when paired with the fact that the Pelta can connect to a phone or tablet via Bluetooth, makes it a reliable choice should you want to go for a walk with it on.

There's no worry of your ears overheating, and the light clamping force gives the headset a small amount of protection from outside sounds. It certainly doesn't have active noise cancelling, which I don't hold against it at this price point, but it does a satisfactory job of immersing you in music on the go.

Asus ROG Pelta specs

Asus ROG Pelta

(Image credit: Future)

Connection: 2.4 GHz, Bluetooth, USB-C
Type: Closed back
Frequency response: 20 Hz - 20 kHz
Drivers: 50mm titanium-plated diaphragm drivers
Microphone: Unidirectional
Features: Tri-Mode connectivity
Weight: 309 g
Price: $130 | £125

None of that would matter if the sound wasn't strong, but luckily, those 50 mm titanium-plated drivers come through clear and loud.

In Doom: The Dark Ages, every inch of the game's music feels well tuned in the headset, pulsing and punching in step with the game's mighty guitars. However, the feeling of the Doom Slayer's feet stomping on the pavement doesn't quite match the weight I was expecting with the standard EQ settings.

Pumping up the bass certainly helps, but the headset doesn't quite have the wurble you may want from authentic bass. It's adequate, but not quite as real sounding as some headsets.

The mids and highs, however, are clear and well-tuned. The intricacies of sound come through well, and directional atmospheric noises are pointed and easy to pick out. This makes listening for footsteps in the likes of Valorant incredibly easy. For its price point, the Pelta really delivers with its drivers.

There's a nuanced fright to the metallic clangs and rhythmic pounding of Buckshot Roulette that adds to the already creepy atmosphere of, well, every other part of the game. The comfort of the headset only adds to this feeling, never daring to intrude on the immersion of whatever game you're playing. It handles music well, too.

Genesis Noir's jazzy, vibey soundtrack slickly oozes out of the Asus ROG Pelta, with both the squealing of smooth sax and the pitter-patter of jazz brushes coming through clear.

In the UK, the ROG Pelta has gone as low as £95, and for that, I don't think I can name a single headset I'd pick over the Pelta right now. However, the US doesn't have it quite as good, only getting a couple of dollars off the headset here and there.

At this point, it puts up a fight against the likes of the Turtle Beach Atlas Air and SteelSeries Arctis Nova 5X. It is over $100 cheaper than the Asus ROG Delta II and certainly doesn't feel like a budget option when it comes to its sound.

As well as Bluetooth, the Pelta has a 2.4 GHz USB-C dongle and can be plugged in via USB-C. The only connectivity option it's really missing is a 3.5 mm port, though squeezing that in would likely make the side controls worse.

As they stand, the on-ear controls are totally fine, and only on the left ear (like the Arctis Nova Pro). The volume wheel can be accessed easily, then there's a switch to flip between connectivity modes, and muting the mic makes a red light pop up on the side, which is great when you've forgotten if you're muted or not.

The one place this could improve is the mic mute: a quick switch on the mic itself, or some sort of smart mute mechanism, would beat the current button on the left earcup, which always takes me a few moments to find.


The mic performs around the middle of the pack for its price range, coming through fairly clear but not beating even a cheaper dedicated microphone. In meetings and games, I never felt like it failed to cut through the noise.

The Pelta has 90 hours of battery life in Bluetooth mode with RGB lighting off and mic on mute. That figure drops to 60 hours with lighting on. The Pelta gets up to 70 hours of power with the lighting off and the mic muted in 2.4 GHz mode, and 45 hours with them both turned on. I practically only charged the headset when the thought occurred to me, and it never ran out. Fast charging means you can get up to 3 hours of listening time with a 15-minute charge, which means if it does run out, getting it up and running again isn't too much hassle.

Buy if…

✅ You want a headset that is light and comfy: Thanks to the pillowy texture on its earcups and its light weight, the Pelta performs admirably in the heat and didn't irritate me after entire days' worth of use.

✅ It's on sale: The Pelta is reasonably priced as is, but in some sales, it has become the only headset I'd personally pick up at its price point.

Don't buy if…

❌ You want a heavy-duty headset: Though I didn't feel like the headset was prone to breaking, it's definitely quite a light build, and I wouldn't trust it to be thrown around.

❌ You like a killer bass: The bass in the Pelta performs just fine, but doesn't have that rattle that some headsets have. A thunderous bass solo won't reverberate through your body, should you want such a fate.

The ROG software, Armoury Crate Gear, is pretty much exactly what I want from it: barely noticeable. It can change some AI noise filtering settings for the microphone, adjust the EQ, customise lighting, and, more importantly, get firmware updates where necessary.

If I could describe my time with the Asus ROG Pelta in a single word, I'd say smooth. It's comfortable, with a healthy battery life, a decent microphone, and all-around great sound. The bass is a tad underwhelming, but that's one of the only real downsides at its reasonable price point.

The light material and great padding have been an absolute boon in hotter months, and connectivity is smooth and fast, with no cutouts in all those evenings spent gaming late into the night.

The Pelta's price point is a tough one for most headsets because there's so much competition at just above $100, but its quality means it doesn't get drowned out.

]]>
https://www.pcgamer.com/hardware/gaming-headsets/asus-rog-pelta-review/ 6nTvrodywTK6oYzViHs2iF Mon, 02 Jun 2025 12:49:39 +0000
<![CDATA[ SteelSeries Rival 3 Gen 2 Wireless review ]]> I really wanted to love the SteelSeries Rival 3 Gen 2 Wireless, and in some ways, I do. It's very versatile, sturdy, and has reminded me of the very real benefits of removable batteries in gaming mice. It's also pretty well priced. Because of all this, I reckon it's ideal for some select use cases.

However, I can't recommend it for use as a primary gaming mouse, not when there are cheaper options that do similar, such as the Logitech G305 Lightspeed, or better options for a little more cash, such as the Turtle Beach Burst II Air or even the Logitech G Pro X Superlight.

But let's start off with the good. The SteelSeries Rival 3 Gen 2 Wireless shares many of the benefits of the original Rival 3, and the main allure—apart from its striking 'Aqua' and 'Lavender' colour options that you can choose instead of plain black or white—is that it has dual wireless connectivity and removable batteries. This makes for an incredibly versatile mouse that's a great Jack of all trades.

It has some other upgrades compared to the first edition—better battery life, PTFE feet, and click latency—but the main one is that it now supports rechargeable batteries.

Rival 3 Gen 2 Wireless specs

SteelSeries Rival 3 Gen 2 Wireless gaming mouse

(Image credit: Future)

Buttons: 6
Connectivity: 2.4 GHz, Bluetooth
Sensor: TrueMove Air Optical
Max DPI: 18,000
Weight: 106 g (2 x batteries), 95 g (1 x battery), 83.5 g (no batteries)
Max acceleration: 40 G
Max speed: 400 IPS
Polling rate: Up to 1,000 Hz
Battery life: Up to 200 hours (2.4 GHz) / 450 hours (Bluetooth)
RGB lighting: No (except on scroll wheel to highlight DPI/connection changes)
Price: $60 / £55

I've been a massive defender of wireless mice over the last few years, but I've always gone for the baked-in type. Now, after using the Rival 3 Gen 2 Wireless for a while, I must say I see the appeal. That's not only because you get some fantastic battery life out of removable batteries (up to 200 hours at 1,000 Hz polling with 2x AAA batteries, here), but also because you never have to plug your mouse in again if they're also rechargeable.

The main benefit of removable, rechargeable batteries in your gaming peripherals is one we discovered with the SteelSeries Arctis Nova Pro headset: It allows you to keep one set of batteries charging while the other one's in use, then swap them over when the battery gets low. This means you can essentially have infinite battery life, without ever having to plug in your mouse. Of course, you'll have to have your batteries plugged in and charging somewhere, but that's more convenient.

That kind of setup is possible with the SteelSeries Rival 3 Gen 2 Wireless, but only if you sort out your own rechargeable batteries and charging station. The mouse itself only comes with two single-use AAA batteries. Which means, of course, that if you want one of the main benefits of this mouse, you have to spend a bit extra.

SteelSeries Rival 3 Gen 2 Wireless gaming mouse on a SteelSeries mousepad with the company logo

(Image credit: Future)

In practice, I found battery life to match up to what's claimed on the spec sheet. I used just one battery at a time to keep the mouse weight a little lighter at 95 g, and I found that to last me just over a week of daily use for work and gaming. Then I just popped off the top and swapped in the new AAA battery.

The main downside to this setup is that 95 g weight. Stacked against some current competition, such as the aforementioned and admittedly more expensive Turtle Beach Burst II Air or Razer DeathAdder V3 HyperSpeed, it's just far too heavy.

There are lightweight options for cheaper than these, too. The Glorious Model O Wireless doesn't cost much more, for instance, and frequently drops down to about the same price as the Rival 3 Gen 2 Wireless, yet it weighs just 69 g. Ditto the NZXT Lift Elite Wireless which weighs just 57 g. You might have to wait a week or two to find such a lightweight mouse on sale, but before long you'll find something much lighter for about the same price as this Rival 3 Gen 2 Wireless.

This weight is par for the course for removable battery mice—just ask the G305 Lightspeed—but it's a drawback nonetheless, unless you're one of the few to prefer a heavier rodent.

SteelSeries Rival 3 Gen 2 Wireless gaming mouse with top open and battery inside

(Image credit: Future)

It's not just the weight itself, either; it's the distribution. Because the batteries sit more towards the back of the mouse, there's a definite drag at the rear end. I found this very noticeable, and while I've become familiar with it over the last couple of weeks of use, I still notice it.

It's not an uncomfortable mouse to use, mind. Far from it. The textured plastic surface feels very nice under my fingertips, and it feels incredibly comfy under my hybrid palm-claw grip. It's pretty shallow, though, so don't expect the pinnacle of hand support for all you palm grippers.

It's sturdy, too, and certainly feels worth its $60 MSRP on the build quality front. That is, in all but one area: the scroll wheel.

SteelSeries Rival 3 Gen 2 Wireless gaming mouse upside down showing its soft little belly

(Image credit: Future)

Unfortunately, this mouse's wheel suffers (although not quite as badly) from what I distinctly remember my very first Razer DeathAdder mouse suffering many, many years ago: It's got a distinctive kind of mushiness to it that makes it feel a little… unsettled on its notches. As if it could slip into that no-go 'between the notches' zone at any moment. It's a difficult sensation to describe, but suffice it to say that it's not the best scroll wheel I've used.

And I'm glad I didn't get around to writing this review until a few weeks in with the mouse, because until now, that was the only complaint I had about the mouse wheel. As of today, though, it's started occasionally squeaking, too. It doesn't happen all the time, but it's a particularly grating sound to my ears, and it's not a good sign that this has started after just a couple of weeks of use.

The other thing that bugs me about the Rival 3 Gen 2 Wireless is its bottom switch that allows you to toggle between Bluetooth, 2.4 GHz, and Off. It's pretty difficult to switch between 2.4 GHz and Bluetooth, and God help you if you want to switch it to the middle Off position. By the time you've applied enough pressure to the tucked-in little bugger to get it to actually move, it's so much pressure that it overshoots the middle.

SteelSeries Rival 3 Gen 2 Wireless gaming mouse software

(Image credit: Future)

It's a good job the sleep functionality works like a charm for that reason, because otherwise this thing's battery would drain a whole lot quicker. This can be toggled in the app settings, and I don't have any complaints on this front, really. You get plenty to fiddle with: polling, DPI, wireless stability enhancement (at the cost of battery life), and so on.

When it comes to gaming, I had no issues with the mouse other than occasionally being a little bothered by its rear weighting. I'm used to lightweight mice like the Logitech G Pro X Superlight, though, and not everyone might find that as big of a deal.

If maximising competitive performance is what you're going for, though, you're probably better off looking elsewhere. This mouse is still using the same TrueMove Air 18K sensor that the original used, and while 18,000 DPI and 400 IPS should be plenty for most people, it's often more about what those relatively modest numbers say about the sensor quality in general than the actual figures themselves.

In the case of the SteelSeries Rival 3 Gen 2, my testing showed the sensor to be a little below average compared to some other gaming mice on the market right now. This primarily shows in the MouseTester sensor consistency results—there's a higher amount of tracking deviation than what you might want out of a sensor for, say, high-level gaming in pro shooters.

Buy if...

✅ You want dual 2.4 GHz and Bluetooth connectivity: This is a pretty well-priced gaming mouse to have such a feature. It's great if you switch between devices a lot.

Don't buy if...

❌ You want the best gaming performance: The Rival 3 Gen 2's sensor isn't as good as what you can get in some other gaming mice today, and the mouse weighs more than many, too.

I didn't run into any problems clicking heads in Counter-Strike 2 or tracking enemy Pharahs with Soldier's rifle in Overwatch 2 (well, no more problems than usual, anyway), but better-performing hands and eyes than mine might be able to notice the difference between this and a top-tier sensor.

So, with all this being said and done, why would one pick up a Rival 3 Gen 2 Wireless? Well, as I said at the start, I think this mouse is great as a Jack of all trades kind of deal—perhaps if you want to use it not primarily for gaming but for work and travel, too.

Its ability to run off rechargeable, removable batteries makes it attractive on the battery life front, and its dual 2.4 GHz and Bluetooth modes make it very versatile. Apart from the slight fiddliness of actually flicking the switch, swapping between the two connections is a breeze, and it allows me to go back and forth between using it with my gaming PC and my laptop at will, instantly.

If all of that is specifically your jam, then have at it. Just remember that you have to spend extra for rechargeable batteries and a battery charger if you want one of this mouse's main benefits.

For me, the weightiness, the mediocre sensor, and most importantly, the squeaky and slightly mushy scroll wheel make it a no-go, considering there are other great options on the market for a similar price. It's a shame, because it'd have been nice to keep on the rechargeable, reusable battery bandwagon more permanently, but the drawbacks aren't worth it here.

]]>
https://www.pcgamer.com/hardware/gaming-mice/steelseries-rival-3-gen-2-wireless-review/ NkGf4tymXDRU6Mtx2cBVVK Fri, 30 May 2025 16:06:52 +0000
<![CDATA[ The Kingmakers system requirements show that the hardest part of running the game may be finding 80 GB free for the install ]]> The Kingmakers system requirements have just been revealed, and the game is looking incredibly easy to run. Given how ambitious the game is, and the fact that it's built in Unreal Engine 5 despite those specs, I can only hope it runs as easily as it plays.

As spotted by PCGamesN, you have to go somewhere no person should ever set foot in to find the Kingmakers system requirements: the Epic Games Store. Notably, the Steam page for the game doesn't have system requirements yet.

To run Kingmakers on Minimum settings, you need a 10th Gen Intel Core i5 processor or better. This could be a chip like the Intel Core i5 10600K, a relatively modest chip from 2020 we rather liked.

Alongside this, you will need at least an RTX 2060. These two specs aren't too tough at all, given a budget to mid-range rig from 2020 can run Kingmakers.

The biggest problem many rigs from that era will have is finding the 80 GB of storage to actually download the game, whereas the 8 GB memory requirement feels almost unheard of for a game of this size launching later this year.

Kingmakers system requirements

Preset | OS | CPU | GPU | Memory | Storage
Minimum | Windows 10 x64 | 10th gen Intel i5 | Nvidia RTX 2060 | 8 GB | 80 GB
Recommended | Windows 10 x64 | 10th gen Intel i7 | Nvidia RTX 3070 | 16 GB | 80 GB

Even running the game on Recommended settings isn't too bad. You will need a 10th Gen Intel Core i7 processor, or better, and an RTX 3070 to run the game. The RAM requirement jumps to a still pretty decent 16 GB, and, as is to be assumed, the storage requirements stay the same.

There are a few things worth noting about the way these system requirements are set up. For one, they're only available on the Epic Games Store, and before release, so they are subject to change. No commitment to system requirements on Steam is certainly a strange choice. The listing also lacks AMD options in both the CPU and GPU categories (sorry team red), but I'm hoping we get clearer spec requirements closer to release.

Kingmakers feels like a game conjured up by 12-year-old me. You play a soldier in the middle of a medieval battle, except you now have a gun, and there's also a time-bending element, explaining where you got all that gear from. It's part action game, part RTS, and built in Unreal Engine 5. This is a bit of a strange choice, as strategy games need to render a high density of bodies on screen, while UE5 shines in close-up environments. Early trailers certainly look a tad grainy, but that could be down to any number of post effects or even (shudders) motion blur.

The game launches into early access on October 8, so we're hoping for a little more information on how the game runs, what it's about, and if we need to do any more upgrades to get it running. As the requirements look right now, there's a good chance you don't. I may just have to delete one or two things.

Best SSD for gaming: The best speedy storage today.
Best NVMe SSD: Compact M.2 drives.
Best external hard drive: Huge capacities for less.
Best external SSD: Plug-in storage upgrades.

]]>
https://www.pcgamer.com/hardware/the-kingmakers-system-requirements-show-that-the-hardest-part-of-running-the-game-may-be-finding-80-gb-free-for-the-install/ ogPc7UXVV8Ta7fmHx2vhhB Fri, 30 May 2025 16:06:38 +0000
<![CDATA[ ASRock acknowledges AMD is not at fault for recent CPU failures and recommends updating the BIOS now, though we still don't know if problems have actually been fixed ]]>

ASRock's motherboard saga has been an ongoing story for months now, with CPUs appearing to fail, ASRock finding debris at fault for one problem, and then over 100 cases of AMD 9800X3D chips dying in just a few short weeks. Just days ago, ASRock acknowledged problems with its boards, with an accompanying BIOS fix, and an interview with the company may suggest this saga isn't quite done.

But first, the context. Users with both ASRock motherboards and AMD 9000-series chips found that something in that specific combination was causing CPUs to die. In an interview with Gamers Nexus, Chris Lee, the VP of motherboards at ASRock, explained the three central cases they found when analysing the problem.

The first, which is said to be a "small percentage", was simply down to user error. Debris finding its way into the socket or thermal paste ending up in the wrong place are just some of the cases that can be attributed to user damage.

In "a lot of cases", after sending CPUs to AMD, it was found out that the CPUs were actually still operational. The cause for this problem is reportedly a mixture of memory compatibility issues and a BIOS problem.

BIOS update 3.25 launched just a few weeks ago (and can be accessed via the ASRock site), and Lee claims this fixes a problem where the Thermal Design Current (TDC) and Electrical Design Current (EDC) were set too high for some AMD chips. The memory compatibility issue was reportedly fixed with the previous BIOS update, 3.20.

AMD Ryzen 7 9800X3D processor

(Image credit: Future)

This was explained a few days ago as a PBO (Precision Boost Overdrive) issue, as the two current limits are linked to that mode, but it wasn't made clear which company was actually responsible. In the latest video, ASRock notably states it is not AMD's fault.

However, this does not account for users who don't have PBO active running into those problems, or claims that some users with the recommended BIOS update are still running into the same problem. As it's all still developing, those cases could be outliers, but users will need time with the new BIOS to see what issues it fixes. Notably, this update will be present on motherboards manufactured in the future, but not those that are already at retail. If you have an ASRock motherboard, make sure you're on the latest version of the BIOS.

Lee tells Gamers Nexus that if there's a problem with the CPU, ASRock will send it back to the user to go through the RMA process with their retailer. ASRock can't replace failing chips, but it could function as a middleman to retailers, which it is currently refusing to do. Notably, though, Lee says this is the case with defective CPUs, so there's a chance he could have misunderstood Gamers Nexus' question as one pertaining to already failing CPUs.

In the interview, Lee claims that ASRock did manage to get its hands on a CPU with scorch marks (indicating an electrical fault such as a short), and it has been sent off to AMD, but the analysis has not yet come back.

Despite months of back and forth, it seems this story isn't quite complete, but either way, now is a good time to get that BIOS update, if you have a Ryzen X3D chip in an ASRock motherboard.


Best gaming PC: The top pre-built machines.
Best gaming laptop: Great devices for mobile gaming.

]]>
https://www.pcgamer.com/hardware/motherboards/asrock-acknowledges-amd-is-not-at-fault-for-recent-cpu-failures-and-recommends-updating-the-bios-now-though-we-still-dont-know-if-problems-have-actually-been-fixed/ Zvokvzf2d4FpnhFkLqPagP Fri, 30 May 2025 15:05:38 +0000
<![CDATA[ A graphical history of id Tech: Three decades of cutting-edge graphics and game engine technologies ]]>

Over 30 years ago, way back in 1992, developer id Software launched Wolfenstein 3D on PCs. Unbeknownst to anyone at the time, it also kick-started an evolutionary tree for a series of game engines that are not just synonymous with id Software's subsequent Doom and Quake franchises, but are also famous for pushing the boundaries of graphics rendering.

Yes, we're talking about id Tech, and with Doom: The Dark Ages, we've now reached version eight. Of course, game engines are about more than just graphics, but graphics are arguably what made id Tech as famous as it is.

So let's take a journey through each successive version of the evergreen game engine, looking at the first game to use it, what made it stand out from the crowd, and take a quick browse through the work of other developers that licensed id Tech for their own projects.

id Tech 0 | Wolfenstein 3D (1992)

In 1992, id Software's programming duties were handled by just three people: John Carmack (graphics and runtime code), John Romero (tools used to make the game), and Jason Blochowiak (sub-routines). Together, they created what would eventually be called id Tech 0, though at the time, it was simply labelled 'Wolfenstein 3D engine'.

It wasn't the first game that Carmack coded to use 3D graphics, though he chose an unusual method for rendering the three-dimensional world: ray casting. In some ways, ray casting is similar to the ray tracing we see in the very latest games, but due to the limitations of gaming PCs in the 1990s, Carmack was forced to keep it very simple.

In Wolfenstein 3D, the graphics rendering starts with the engine creating the ceiling and floor via a flood fill, as two blocks of colour. Then one ray is cast or 'marched' out for every vertical line of pixels—for a resolution of 1080p, that would be a total of 1920 rays. Of course, in 1992, we're only talking about 320 or so rays!

Each ray traverses a two-dimensional map of the world and travels until it reaches an object, such as a wall or door. From how far the ray has travelled, the object's size is calculated and appropriately scaled, to give the impression of depth. A spot of clever math is also used to correct the perspective of the objects, otherwise, everything would look warped, fisheye lens style.

Once this is all done, the engine then colours and textures the column of pixels, before moving on to the next ray. After all of this has been achieved, objects such as enemies, ammo, and food are rendered in the form of 2D sprites.
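
If you're curious what that pipeline looks like in practice, below is a heavily simplified ray-casting sketch in C. The map, constants, and marching step are invented for illustration, and the real engine used fixed-point maths and lookup tables rather than floating point, but the core loop is the same idea: one ray per screen column, a distance check against a 2D grid, and a fisheye-corrected wall height.

```c
/* Heavily simplified ray-casting column renderer in the spirit of
   Wolfenstein 3D. Map, constants and step size are illustrative only. */
#include <math.h>
#include <stdio.h>

#define MAP_W 8
#define MAP_H 8
#define SCREEN_W 320
#define SCREEN_H 200
#define FOV 1.0471975512   /* 60-degree field of view in radians */

static const int map[MAP_H][MAP_W] = {   /* 1 = wall, 0 = empty space */
    {1,1,1,1,1,1,1,1}, {1,0,0,0,0,0,0,1}, {1,0,0,0,0,1,0,1}, {1,0,0,0,0,1,0,1},
    {1,0,1,1,0,0,0,1}, {1,0,0,0,0,0,0,1}, {1,0,0,0,0,0,0,1}, {1,1,1,1,1,1,1,1},
};

int main(void)
{
    double px = 3.5, py = 3.5, pangle = 0.0;   /* player position and facing */

    for (int col = 0; col < SCREEN_W; col++) {
        /* Cast one ray per vertical column of pixels */
        double ray_angle = pangle - FOV / 2.0 + FOV * col / SCREEN_W;
        double dist = 0.0;
        while (dist < 16.0) {                  /* march until the ray hits a wall */
            dist += 0.01;
            int mx = (int)(px + cos(ray_angle) * dist);
            int my = (int)(py + sin(ray_angle) * dist);
            if (map[my][mx] == 1)
                break;
        }
        /* Fisheye correction: use the distance perpendicular to the view plane */
        double perp = dist * cos(ray_angle - pangle);
        int wall_height = (int)(SCREEN_H / perp);   /* nearer walls draw as taller columns */
        printf("column %3d: wall height %d px\n", col, wall_height);
    }
    return 0;
}
```

Each column only needs a single distance value, which goes a long way towards explaining how the technique ran at a playable speed on early-90s PCs.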

We've skipped over a lot of the technical details, but if you're interested in understanding exactly how it all works, then grab a copy of Wolfenstein 3D Game Engine Black Book by Fabien Sanglard.

id Tech 0 would be licensed by a handful of other developers, with the most notable being Apogee Software, which created Rise of the Triad, a game that we reckon everyone should play at least once.

id Tech 1 | Doom (1993)

For id Software's 1993 follow-up, the seminal Doom, Carmack improved on a rendering approach he had first developed for the SNES version of Wolfenstein 3D. Instead of using ray casting to calculate what should be displayed, Doom's engine (aka id Tech 1) handles all of this through a BSP tree, or binary space partitioning tree.

Just like its predecessor, Doom's levels are all 2D maps, and a BSP algorithm is used to split them up into a data structure that makes it super quick to work out if something is visible or not.

Vertical surfaces are rendered first, such as walls and doors, by working through the BSP tree. This is followed by all the horizontal surfaces—ceilings and floors—that don't get rendered during the 'vertical' phase. 2D sprites are handled next, to fill the world with monsters, weapons, and ammo, before the final effect—the game's head-up display.
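
To give a rough idea of how that ordering works, here's a small C sketch of a front-to-back BSP walk. The node layout and names are made up for the example; the real engine reads precomputed nodes, segs, and subsectors from the WAD file rather than building anything at runtime.

```c
/* Sketch of a front-to-back BSP tree walk, the core idea behind Doom's
   visibility ordering. Node layout and names are invented for illustration. */
#include <stddef.h>

typedef struct BSPNode {
    double x, y, dx, dy;                  /* partition line: point and direction */
    struct BSPNode *front, *back;         /* both NULL marks a leaf (a convex subsector) */
} BSPNode;

/* Which side of this node's partition line is the viewer on? */
static int viewer_on_front_side(const BSPNode *n, double vx, double vy)
{
    double cross = (vx - n->x) * n->dy - (vy - n->y) * n->dx;
    return cross <= 0.0;
}

static void draw_subsector(const BSPNode *leaf)
{
    (void)leaf;                           /* rasterise the leaf's wall segments here */
}

/* Visit subsectors nearest-first: because the closest walls are drawn first,
   the renderer can simply skip screen columns that are already filled. */
static void render_bsp(const BSPNode *node, double vx, double vy)
{
    if (node == NULL)
        return;
    if (node->front == NULL && node->back == NULL) {
        draw_subsector(node);
        return;
    }
    if (viewer_on_front_side(node, vx, vy)) {
        render_bsp(node->front, vx, vy);  /* near side first */
        render_bsp(node->back, vx, vy);   /* then the far side */
    } else {
        render_bsp(node->back, vx, vy);
        render_bsp(node->front, vx, vy);
    }
}
```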

Just as for Wolfenstein 3D, Fabien Sanglard has written a superb breakdown of the making of Doom and how its engine fundamentally works. If you want to know exactly how BSPs are used in the game, then the Doom Game Engine Black Book is a must-read.

Doom's impact on the world of gaming is hard to overstate—not just in terms of shaping genres and game design, but also in how its engine would be licensed by other developers to create very successful games. Wolfenstein 3D and Doom's engines would only be used in a handful of other games, though the latter would be used to create the fantastic Heretic and its sequel, Hexen.

But things really changed for id Software's next big release.

id Tech 2 | Quake (1996) | Quake 2 (1997)

Given Doom's rampant success, it was always going to be a challenge to come up with something better. Fortunately, 1996's Quake would turn out to be just as influential as its predecessor. This was in no small part due to the new game engine that Carmack and co created, as id Tech 2 sported an entirely new renderer.

Instead of flat maps drawn to mimic three dimensions, then overlaid with 2D pre-rendered sprites, Quake's engine handled everything in true 3D. Just like most games do today, arrays of vertices were used to generate polygonal meshes, where the engine transforms and lights them, before rasterising them into pixels.

Textures are then applied to each polygon in one rendering pass, with pre-rendered lightmaps blended in a second pass.

The result was a world that you could look at and move through in any direction, with surfaces no longer constrained to be fully vertical or horizontal. Those pre-rendered lightmaps did a great job of giving a sense of realistic lighting and shadowing—for 1996, of course!
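
The second-pass blend itself is simple: each texel gets multiplied by its lightmap sample, so fully lit areas keep the texture's original colour and darker areas dim it. A minimal C sketch of that modulate step, with the 0-255 colour handling simplified for illustration:

```c
/* Texture-times-lightmap modulate, the essence of the Quake-style second pass.
   Colour handling is simplified to 8-bit channels for illustration. */
typedef struct { unsigned char r, g, b; } Color;

static Color shade_texel(Color texel, unsigned char light)
{
    Color out;
    out.r = (unsigned char)((texel.r * light) / 255);   /* 255 leaves the texel untouched */
    out.g = (unsigned char)((texel.g * light) / 255);
    out.b = (unsigned char)((texel.b * light) / 255);
    return out;
}
```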

The engine in 1997's Quake 2 was more of the same (i.e. still essentially id Tech 2), just with sensible improvements all around, such as using the OpenGL API for rendering, allowing for better graphics and performance on graphics cards that supported it.

But perhaps what made the engine really stand out is the number of other developers who licensed either and used it to create some outstanding games. Hexen 2 and Heretic 2, Malice, SiN, Kingpin, Soldier of Fortune, and Anachronox were all great games in their own right, though the real star has to be Half-Life.

Admittedly, Valve rewrote an awful lot of id Tech 2 for its masterpiece, as well as created its own tools and additional sub-routines, but if it didn't exist in the first place, would Half-Life have been as good as it was?

id Tech 3 | Quake 3 Arena (1999)

With Doom sidelined in favour of Quake, id Software chose a multiplayer focus for 1999's Quake 3 Arena, but that didn't mean Carmack gave up pushing the graphics envelope to new heights.

Quake 3 Arena's engine (they still weren't being labelled as id Tech at this stage) introduced three big improvements for graphics: spline-based curved surfaces, shaders, and a fast inverse square root function.

The first one solved the problem that 3D polygonal games suffered from at the time: how to make a surface look curved instead of angular. This was done by using Bézier patches to tessellate, or split up, a mesh into dozens more smaller triangles, which together give the impression of a curved surface.
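
Here's a minimal C sketch of that idea, evaluating a single point on a 3x3 (biquadratic) Bézier patch; sample it across a grid of (u, v) values and you get the vertices of all those smaller triangles. The structures and names are illustrative rather than lifted from the engine.

```c
/* Evaluate one point on a biquadratic Bezier patch defined by a 3x3 grid of
   control points. Types and names are illustrative. */
typedef struct { float x, y, z; } Vec3;

/* Quadratic Bernstein blend of three scalars at parameter t in [0, 1] */
static float bez(float a, float b, float c, float t)
{
    float it = 1.0f - t;
    return it * it * a + 2.0f * it * t * b + t * t * c;
}

static Vec3 eval_patch(const Vec3 ctrl[3][3], float u, float v)
{
    Vec3 row[3], p;
    for (int i = 0; i < 3; i++) {         /* blend each row of control points in u */
        row[i].x = bez(ctrl[i][0].x, ctrl[i][1].x, ctrl[i][2].x, u);
        row[i].y = bez(ctrl[i][0].y, ctrl[i][1].y, ctrl[i][2].y, u);
        row[i].z = bez(ctrl[i][0].z, ctrl[i][1].z, ctrl[i][2].z, u);
    }
    p.x = bez(row[0].x, row[1].x, row[2].x, v);   /* then blend the results in v */
    p.y = bez(row[0].y, row[1].y, row[2].y, v);
    p.z = bez(row[0].z, row[1].z, row[2].z, v);
    return p;
}
```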

Today's graphics cards can handle billions of triangles, but back then, it was a major breakthrough in producing ever-more realistic graphics at a playable frame rate. That said, Quake 3 Arena could still bring the most powerful of gaming rigs to its knees once the resolution and graphics settings were maxed out.

The shader system used in Quake 3 Arena was very much a precursor to the shaders used in today's games. Rather than just applying a base texture and lightmap to a polygon, the engine used small scripts to describe the properties of a surface. In this file, there would be information about textures, blending, volumes, emissivity, and even what audio files should be played if the player interacts with the surface.

And while he didn't invent the technique itself, Carmack's implementation of calculating the inverse square root of a number—that is, one divided by the square root of something—became famous in programming circles for being a masterpiece in optimisation and speed.
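
For the curious, this is essentially the routine as it appears in the released Quake 3 Arena source, tidied up with fixed-width types and memcpy (the original used 'long' and pointer tricks that assumed 32-bit builds) and with comments describing what each step is doing:

```c
/* Fast inverse square root, essentially as found in the released Quake 3
   source. The bit trick gives a rough guess; one Newton-Raphson step refines it. */
#include <stdint.h>
#include <string.h>

float q_rsqrt(float number)
{
    float x2 = number * 0.5f;
    float y  = number;
    uint32_t i;

    memcpy(&i, &y, sizeof i);             /* reinterpret the float's bits as an integer */
    i = 0x5f3759df - (i >> 1);            /* the famous magic constant: a first approximation */
    memcpy(&y, &i, sizeof i);

    y = y * (1.5f - (x2 * y * y));        /* one Newton-Raphson iteration */
    return y;
}
```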

But what truly made id Tech 3 stand out was the sheer number of games created with it, and the list reads like a 'Who's Who?' of gaming. The two Star Trek: Elite Force shooters, two Medal of Honor games, the original Call of Duty, the Star Wars Jedi Knight duo, Soldier of Fortune's sequel, Return to Castle Wolfenstein, and American McGee's Alice, to name but a few!

For its next big release, though, id Software decided it was time to return to an old classic.

id Tech 4 | Doom 3 (2004)

Doom 3 appeared in 2004, and once again, id Software used a new engine, id Tech 4, to raise the graphics bar to a new level. Texturing was now far more advanced, supporting the use of normal and specular maps to improve the level of fine detail on surfaces.

To make this even better, though, Doom 3's renderer calculated all of the lighting on a per-pixel basis, rather than just colouring a triangle's corner (a vertex) and then interpolating that across the rest of the polygon.
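
To make the distinction concrete, here's a small C sketch of the per-pixel approach: fetch a normal for each pixel (typically from a normal map) and evaluate the light contribution there, rather than computing a colour at three vertices and interpolating. It's plain C standing in for what the GPU actually does, and the types and names are invented for the example.

```c
/* Per-pixel diffuse (Lambertian) term: evaluated for every pixel using the
   normal-map normal, instead of being interpolated from vertex colours.
   Plain C stand-in for GPU shading; types and names are illustrative. */
#include <math.h>

typedef struct { float x, y, z; } Vec3;

static float dot3(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

static Vec3 normalize3(Vec3 v)
{
    float len = sqrtf(dot3(v, v));
    Vec3 out = { v.x / len, v.y / len, v.z / len };
    return out;
}

static float pixel_diffuse(Vec3 mapped_normal, Vec3 pixel_pos, Vec3 light_pos)
{
    Vec3 to_light = { light_pos.x - pixel_pos.x,
                      light_pos.y - pixel_pos.y,
                      light_pos.z - pixel_pos.z };
    float ndotl = dot3(normalize3(mapped_normal), normalize3(to_light));
    return ndotl > 0.0f ? ndotl : 0.0f;   /* surfaces facing away from the light get nothing */
}
```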

Doom 3 also used stencil buffers for its shadow volumes, and together with the per-pixel lighting, the game's visuals were second to none. However, the game's performance wasn't so great.

The original Doom was all about frantic, fast-paced action, whereas Doom 3 was considerably slower—almost like an action, survival-horror game. But even so, the use of stencil buffers and the lighting algorithm demanded the very latest GPU features, leaving a lot of older hardware unable to run the game, and even with a high-end graphics card, the performance was never great.

id Tech 4 was less popular with other developers, too, and compared to Quake 3 Arena, only a small number of studios used it for their games. Still, it gave us the much-maligned Quake 4, the original Prey, Enemy Territory: Quake Wars, and the underrated Brink.

Five years after the launch of Doom 3, id Software was acquired by ZeniMax Media, which also owned Bethesda Softworks, and from that point onwards, id Software's engines would only be used by itself and ZeniMax's other studios.

Doom 3's engine would also be the last one that id Software ever released the source code for.

id Tech 5 | Rage (2011)

In 2011, id Software once again introduced a new engine with a new name, and even a new game IP. Rage was powered by id Tech 5 (the older engines were then retrospectively renamed in the same manner), and its main standout rendering feature was a technology called MegaTexture.

Technically known as virtual texturing, the method involves using incredibly large textures, up to 128,000 pixels square, from which sections are sampled and applied to a polygon mesh. The idea behind this is that the main, massive texture stays in system memory and only the parts required are streamed into the GPU's local memory.
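
At the heart of any virtual texturing scheme is a page table that maps coordinates in the enormous virtual texture to whichever small pages are actually resident in GPU memory. Here's a simplified C sketch of that lookup; the page size, table layout, and streaming callback are all invented for illustration and bear no relation to id Tech 5's actual data structures.

```c
/* Simplified virtual-texture page-table lookup. All sizes and structures are
   invented for illustration. */
#include <stdbool.h>

#define PAGE_SIZE 128                      /* texels per page side */
#define VT_PAGES  1024                     /* virtual texture is VT_PAGES x VT_PAGES pages */

typedef struct {
    bool resident;                         /* is this page currently in the GPU-side cache? */
    int  cache_slot;                       /* where in the physical cache it lives */
} PageEntry;

static PageEntry page_table[VT_PAGES][VT_PAGES];

/* Translate virtual texel coordinates into a cache slot, requesting a
   stream-in on a miss; the caller falls back to a lower-detail mip meanwhile. */
static int lookup_texel(int vx, int vy, void (*request_page)(int px, int py))
{
    int px = vx / PAGE_SIZE;
    int py = vy / PAGE_SIZE;

    if (!page_table[py][px].resident) {
        request_page(px, py);              /* miss: queue this page for streaming */
        return -1;
    }
    return page_table[py][px].cache_slot;
}
```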

The benefit of this approach is that id Software could make one renderer that would work exactly the same, regardless of the hardware platform. Unfortunately, the reality was a whole heap of bugs and rendering glitches at launch, though when everything did work well, the texture quality was certainly better than that of many other PC games at the time.

It helped that id Tech 5 was packing a whole raft of other graphics tricks, such as soft shadows, HDR rendering, volumetric lighting, screen space reflections, depth of field, and motion blur. ZeniMax Media had two of its studios use id Tech 5.

MachineGames created the Wolfenstein reboots of The New Order and The Old Blood, and Tango Gameworks made The Evil Within series, albeit heavily rewriting id Tech 5 for the second title.

In 2013, John Carmack left id Software for new ventures, the last of the original team to leave, joining Oculus VR to help develop its virtual reality hardware and software.

id Tech 6 | Doom (2016)

It was a bold move to reboot such a well-known franchise, but that's exactly what id Software did, and in 2016, Doom returned with one hell of an engine.

While he was still at id Software, John Carmack explored the use of ray tracing and voxels for the next id Tech iteration, but when he left, the developers stuck to a traditional polygonal mesh renderer with standard rasterisation to create id Tech 6.

MegaTextures were still in full use but higher in quality, and along with the entire dictionary of modern rendering techniques, the new Doom not only looked extremely good, but it ran very well, too.

It also had a couple of new tricks up its sleeve, such as temporal anti-aliasing, or TAA. This had already been around for a while, but id Tech 6 added the use of super-sampling of motion data to remove the shimmering and blurring that TAA is well-known for.
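
At its simplest, TAA boils down to reprojecting last frame's result using a motion vector and blending it with the new frame's sample. The sketch below shows only that blend; real implementations, id Tech 6's included, layer on neighbourhood clamping and smarter filtering to keep ghosting and blur in check, and the names and blend weight here are purely illustrative.

```c
/* Bare-bones temporal anti-aliasing accumulation for a single pixel.
   history_reprojected is last frame's colour fetched via the motion vector. */
typedef struct { float r, g, b; } ColorF;

static ColorF taa_resolve(ColorF current_sample, ColorF history_reprojected, float blend)
{
    /* blend is typically around 0.9: mostly history, refreshed a little each frame */
    ColorF out;
    out.r = history_reprojected.r * blend + current_sample.r * (1.0f - blend);
    out.g = history_reprojected.g * blend + current_sample.g * (1.0f - blend);
    out.b = history_reprojected.b * blend + current_sample.b * (1.0f - blend);
    return out;
}
```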

While it might sound like id Tech 6 didn't bring anything significant to the world of rendering, Adrian Courrèges' breakdown of the graphics in Doom (2016) shows just how complex and multifaceted everything is behind the scenes.

Sadly, id Tech 6 would only be used two more times, by MachineGames again, for two more Wolfenstein games—The New Colossus and Youngblood. The latter was later patched to support Nvidia's RTX technology, though, as it was an early implementation of ray tracing, the results didn't justify the performance hit.

id Tech 7 | Doom Eternal (2020)

For 2020's Doom Eternal, id Software took id Tech 6 and gave it a thorough overhaul, removing the OpenGL code in favour of Vulkan-only, as well as dropping MegaTextures.

By now, id Tech was running as a heavily multi-threaded engine, and instead of using a primary thread to handle the rendering, multiple threads were tasked to operate in parallel, handling shaders, asset streaming (geometry and textures), and data decompression.
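
As a rough picture of that job-based approach, here's a small C sketch using POSIX threads: a handful of workers pull independent tasks from a shared queue instead of a single render thread doing everything in sequence. The queue layout and job names are invented for the example, and the real engine's job system is far more sophisticated.

```c
/* Minimal worker-pool sketch: several threads drain a shared job queue.
   Queue layout and job names are illustrative only. */
#include <pthread.h>
#include <stdio.h>

#define NUM_WORKERS 4
#define NUM_JOBS    12

typedef struct {
    const char *jobs[NUM_JOBS];
    int next;
    pthread_mutex_t lock;
} JobQueue;

static void *worker(void *arg)
{
    JobQueue *q = (JobQueue *)arg;
    for (;;) {
        pthread_mutex_lock(&q->lock);
        if (q->next >= NUM_JOBS) {         /* queue drained: this worker is done */
            pthread_mutex_unlock(&q->lock);
            return NULL;
        }
        const char *job = q->jobs[q->next++];
        pthread_mutex_unlock(&q->lock);
        printf("processing: %s\n", job);   /* stand-in for the actual work */
    }
}

int main(void)
{
    JobQueue q = { .next = 0 };
    pthread_mutex_init(&q.lock, NULL);
    for (int i = 0; i < NUM_JOBS; i++)
        q.jobs[i] = (i % 3 == 0) ? "build shaders"
                  : (i % 3 == 1) ? "stream assets" : "decompress data";

    pthread_t threads[NUM_WORKERS];
    for (int i = 0; i < NUM_WORKERS; i++)
        pthread_create(&threads[i], NULL, worker, &q);
    for (int i = 0; i < NUM_WORKERS; i++)
        pthread_join(threads[i], NULL);

    pthread_mutex_destroy(&q.lock);
    return 0;
}
```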

The end result is a game that has vastly more detail than its predecessor and even more accurate lighting and shadows, and countless more particles for explosions and gore. And remarkably, it runs even faster than id Tech 6.

A screenshot of the PC version of Indiana Jones and the Great Circle, with path tracing enabled in the graphics settings.

(Image credit: Bethesda Softworks)

A year after launch, Doom Eternal was updated to support Nvidia's DLSS AI-powered upscaling, along with ray-traced reflections, to allow all surfaces to reflect the environment correctly.

Ray tracing would be used exclusively for lighting and shadowing in the only other game to use id Tech 7, Indiana Jones and The Great Circle—MachineGames, once again—and this almost certainly laid the foundation for the next iteration of id Tech.

id Tech 8 | Doom: The Dark Ages (2025)

And so we come to Doom: The Dark Ages and its id Tech 8 engine—a prequel to the 2016 Doom game but with all the graphical enhancements that MachineGames' Indiana Jones offered.

With ray tracing used all the time, and with no shader-based fallback system for GPUs without ray tracing hardware, id Tech 8 was never going to be as speedy as its predecessor, but it still runs pretty well, all things considered.

We're still learning more about what id Tech 8 has underneath the hood, as id Software has been pretty quiet about its new engine, but we do know that Doom: The Dark Ages will be patched in the near future to support path tracing.

This is a method of ray tracing that produces the most physically accurate lighting, shadows, reflections, and refractions, but as any PC gamer who's tried will tell you, the hardware demands are incredibly high.

Naturally, it will mean that upscaling—and perhaps even frame generation—will be an absolute must, and it also means that id Software will need to ensure id Tech 8's denoiser algorithm is top-notch, too.

The slower pace of Doom: The Dark Ages compared to Doom 2016 and Doom Eternal means that the outright frame rate won't be quite as important as it is in those games, but given id Software's track record of developing extremely performant engines of late, we're pretty confident that path tracing will be worth enabling.

Assuming you have the PC hardware for it, of course!

What's next for id Tech?

(Image credit: Bethesda Softworks / id Software)

With a gap of four to five years between each release of a new id Tech engine, we're obviously not going to see another one until the end of this decade. That's sufficient time to allow for two more generations of new GPUs, and if you compare today's graphics processors to those from 2020, we can maybe judge what id Tech 9 will bring to the table.

AMD and Nvidia's latest chips directly use AI to improve performance and image quality, through upscaling, frame generation, and denoising, but the latter also introduced the concept of neural rendering with the launch of its RTX 50 series.

Five years ago, AMD launched its first ray-tracing capable GPUs—the Radeon RX 6000 series—and Nvidia released its second generation of RTX chips, the 30 series. Back then, there was no frame generation and no AI-based denoising, and neural rendering was still in its research infancy.

This suggests that by 2030, id Tech 9 could be making use of AI to improve the quality of its graphics, to make them ever more realistic, all while keeping the performance as high as possible. The next Doom or Quake game could be permanently path-traced, perhaps always using upscaling, too.

But whatever we do get, id Software's long history of creating graphically-intensive games means that we can be sure that we'll be in for a visual treat.

Although the id Tech engine is no longer the one to watch for new rendering technologies—everything in Doom: The Dark Ages has been done before—it still sets the bar when it comes to optimised performance.

Now, how many game engines can you say that about?

]]>
https://www.pcgamer.com/hardware/a-graphical-history-of-id-tech-three-decades-of-cutting-edge-graphics-and-game-engine-technologies/ tFJFJtwjUAw4XQej2pYzWK Fri, 30 May 2025 14:53:33 +0000
<![CDATA[ Phanteks Eclipse G400A review ]]> Phanteks has put together a mighty strong proposition in the Phanteks Eclipse G400A. It's affordable, easy to build into, and importantly comes with four large fans for plenty of airflow.

To give the Eclipse G400A a proper test, I set about building an affordable PC inside it. Easier said than done in today's economy, but I just about managed to get it done for a reasonable budget. You can read all about that in our budget build guide, but I took away from that experience a very positive outlook on everything the Phanteks G400A has to offer.

The G400A's frame is machined well with few sharp edges and clean corners. The case I received had no visible marks on any of the panels, which you'd think would be a given with a brand new case, but scratches can and do happen in transit with poor packaging. There's only a small box of accessories included with the G400A, though Phanteks has been extremely generous by putting 30 zip-ties in there—count 'em, 30! It also has the prerequisite screws for the motherboard, PSU, HDD, and a couple extras.

It's a similarly priced unit, at $110/£85, to the Corsair Frame 4000D with fans. Both have their pros and cons. The Corsair is a better-looking case in my opinion, though it also has panel flex (which I'm told might go away soon as Corsair moves to thicker metal). The Phanteks is built like a tank and has one more fan than the Corsair, and they're all the larger 140 mm variety.

Eclipse G400A specs

A gaming PC sat on a desk with purple RGB lighting on the fans and light bar enabled.

(Image credit: Future)

Form factor: Mid-tower
Motherboard support: E-ATX (up to 280 mm), ATX, Micro-ATX, Mini-ITX
Storage bays: 2x 2.5-inch, 2x 3.5-inch
Front IO: 2x USB Type-A, 1x USB Type-C, 3.5 mm, power + reset
Fan support: 7x 120 mm/6x 140 mm
Radiator support: 360 mm max (top), 120 mm max (rear)
GPU clearance: 415 mm
Weight: 8.63 kg
Dimensions: 495 L x 230 W x 522 H mm
Price: $110/£85

Where this case really shines as an affordable, almost budget, chassis is in fan selection. There are four M25-140 Gen2 D-RGB fans included on the G400A. These are not your basic case fans. Case in point, a single fan would cost you around £10 if you were to buy it separately. These are some of Phantek's latest, with strong stats and lit up to the nines with RGB-lit blades and an infinity mirror effect on the central fan hub.

These fans are connected using a proprietary connector that combines PWM fan control and RGB control. It's pretty easy to daisy-chain more of them, terminating in the standard PWM and A-RGB headers using an adapter cable, though you shouldn't have to worry about that. The case comes pre-wired and ready to go, and the four fans fill out the case with enough cooling that you shouldn't have to worry about installing any more.

A dust filter covers the front of the case for all three fans. It has a relatively small impact on the airflow through the front of the case when installed; however, once combined with the mesh front panel, it does stack up to become a bit more of a hindrance. Using an anemometer, I measured airflow to be around 1.9 m/s with the dust filter and front panel fitted, 2.2 m/s with only the dust filter, and 2.3 m/s without either. That's more than some I've tested, but even so, I'm not too worried about airflow in this case with those three 140 mm fans loaded in the front.

I opted for an air cooler for my test build, Arctic's Freezer 36, though you could just as easily fit up to a 360 mm radiator inside the top of the Phanteks G400A, and without moving around any fans. The top mesh panel pops off the case for easier access and there's easily enough room for a standard thickness radiator between it and the top of the motherboard once fitted. The Phanteks is quite generously sized all-round at 495 L x 230 W x 522 H mm. It didn't feel quite as spacious as the Be Quiet! Shadow Base 800 FX, but it's as easy to build into.

Along the bottom of the case is a PSU shroud with plenty of room for cables. The PSU sits in the rear and in front of it is the storage bracket for any 2.5-inch or 3.5-inch drives you might have. I didn't have any, which means I could have removed this storage bracket entirely if I wanted to, but I had such a gluttony of room under the shroud that I left it in.

The PSU shroud is where Phanteks sought to apply the only dab of flair to the entire build, bar the RGB fans. There's a light strip that runs the length of the PSU shroud. It's diffused, which looks so much better than a strip with obvious LEDs, and it's easily controllable in line with the fans through the motherboard. In my case, via ASRock's RGB software.

The cable management on the rear is a real boon to the G400A, too. It's a smart approach that ditches the usual channels and ties for velcro loops. It sounds pretty pedestrian, but the way the velcro loops around makes it much easier to thread through new cables, or remove and replace them, without undoing all your hard work cable managing. This is great for beginners, especially if you're prone to running a few extra cables or reworking your system to get it just right.

Buy if...

✅ You want top airflow for less: This case comes with four M25-140 Gen2 D-RGB fans, and if you ignore most of that name, the important bit is the '140', for 140 mm. That means slower spinning fans, less noise, but plenty of airflow.

Don't buy if...

❌ You want the *aesthetic*: The G400A is a pretty standard-looking case. There's an RGB light strip, sure, but otherwise it's something you stuff parts into and game on. No frills, no shiny glass box, both of which you could get for a little more money.

My only concern with the rear cable management is that the right-most side of the chassis, traditionally where your CPU power cables go, only offers a few tabs to loop a zip-tie around to secure any cables going that way. But it's times like these when it's important not to forget the 30 zip-ties included in the box, so you can cut and reconnect many times over without running out.

The G400A is certainly not the flashiest case on the market. You could spend a similar amount on just a case, no fans, on the likes of the Phanteks Evolv X2 or Lian Li O11 Vision Compact. Both of which are some of the best-looking glass boxes around. But you won't quite get the affordability of four 140 mm fans included in the box, and that's what Phanteks has got absolutely right with the G400A. It's a wonderfully easy option for first-time buyers.

That's really the strongest argument for the G400A: if you're set to brave your first PC build ever (don't worry, it's not that bad), this is a superb choice. You have all the case cooling you need already wired up and ready to go, there's ample room for a power supply and no tight connections or corners to work with in the rare occurrence you have to troubleshoot.

]]>
https://www.pcgamer.com/hardware/pc-cases/phanteks-eclipse-g400a-review/ JZyrkhUFoJKdp2otxrRrfU Fri, 30 May 2025 10:59:26 +0000
<![CDATA[ Nvidia's Jen-Hsun Huang says Chinese competitors are 'quite formidable' just days after the announcement of a Chinese RTX 4060-level GPU powering up ]]> US export controls have increasingly stopped GPUs with certain AI capabilities from getting into China, and just this week, Nvidia stated the US's ban on shipping H20 chips into the country meant a 'multibillion-dollar write-off' for the tech company. However, these protectionist policies are not keeping China from building its own GPUs, as a new China-made GPU has just powered up.

Lisuan Technology, a Chinese startup, has been developing the "first self-developed architecture and fully independent intellectual property GPU chip", and it turned on this week. This was announced in a recent WeChat post, alongside additional guarantees to "carry out detailed and comprehensive software and hardware testing and driver optimization work."

According to Tom's Hardware, the 6 nm GPU chip is targeting RTX 4060-level performance, and is currently titled the G100. Lisuan has reportedly been working on the G100 since 2023, with plans to launch it in 2023, so there is a level of skepticism around whether or not it can actually hit that RTX 4060 performance level.

Despite being a budget card from the last generation, the RTX 4060 is still an impressive card, built on the 5 nm process from TSMC. The smaller the process, the higher the density of transistors, and this results in better performance and efficiency.

Effectively, it's harder (and sometimes impossible) to get the same performance out of older processes.

Nvidia GeForce RTX 4060 display outputs

MSI's RTX 4060 (Image credit: Future)

The G100 powering on is a good sign for the card, but it's the first of many steps before it can actually see a launch into the Chinese market. Further optimisation and plain old testing are needed, especially as it's targeting "the needs of desktops, notebooks, graphic workstations and other devices."

Huang tells Bloomberg, "The Chinese competitors have evolved" and Huawei, with its new AI chips, has become "quite formidable." Given that domestic access to advanced manufacturing processes for a chip like the G100 is quite limited, it is likely that Huawei's Ascend 920 and Lisuan's G100 are using silicon from the same Chinese foundry: SMIC. This is all according to estimations from Tom's Hardware.

According to Huang, “Like everybody else, they [Chinese companies] are doubling, quadrupling capabilities every year." This is all to build towards a central point that Huang wants "all of the world's AI researchers and all of the world's developers to be building on American stacks".

This is reportedly "irrespective of the near-term revenue success that we have", though opening up sales to a bigger market surely can't hurt the hardware giant.


Best CPU for gaming: Top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game first.

]]>
https://www.pcgamer.com/hardware/graphics-cards/nvidias-jen-hsun-huang-says-chinese-competitors-are-quite-formidable-just-days-after-the-announcement-of-a-chinese-rtx-4060-level-gpu-powering-up/ yGYWBtE5NRYLyHrrfuWron Fri, 30 May 2025 10:56:44 +0000
<![CDATA[ Is it a chair? Is it a PC? Actually, this stealth PC build is both—and probably very toasty ]]> I'm not going to pretend my home office was ever tidy but since joining the hardware team, my far-from-spacious Micke desk from Ikea has been besieged. Even something as straightforward and necessary as upgrading my desktop setup for work has only served to highlight just how little space I've got to play with—mind, all of my Miku Hatsune figures probably don't help the space equation, but they really are non-negotiable. If some out-there tech maker was to, say, rock up with a radically space-saving setup, I'd be all ears.

Well, the latest project from YouTuber Basically Homeless has mined a space-saving solution from somewhere you'd find in almost every PC gaming setup—namely, the chair (via Hackaday). But this project doesn't just slot a Raspberry Pi behind the headrest and call it a day; the goal was a largely invisible, "no compromises setup" with an allowance for only one trailing wire.

Using the C7 Max premium ergo office chair from FlexiSpot as a base for the PC's unlikely case, the tech creator got to work sourcing desktop-grade parts small enough to fit somewhere inside.

Drilling holes throughout the wheel base, not to mention feeding the power cable through in such a way that it wouldn't immediately get snarled up by the chair's castors, was arguably the easy part. Beyond simply using small form factor parts, the tech creator also had to figure out where the parts were going to even fit. After some intense Tetris-ing, the parts found their home right under the seat.

50 mm aluminium spacers support the wooden base of the seat, while allowing for a very narrow amount of space below for an AMD Ryzen 7 9800X3D CPU, a low-profile Nvidia GeForce RTX 4060 GPU flipped sideways, and 64 GB of DDR5 RAM. But rather than leave this hefty hardware exposed, the maker instead spent months 3D modelling and printing a bespoke housing for it. You can download the files for it from the free tier of the creator's Patreon if you also happen to want to make your own FlexiSpot C7 Max stealth PC.

The (far from) final assemblage of the project just underlines how niche and impractical it is for most—especially when the YouTuber's heavily-used Asus GPU decided not to power on properly and the chair had to be pried apart to swap it out. Anyway, trials and tribulations aside, the PC chair does eventually power on fully, ensuring toasty buns for every gamer that perches upon its reasonably powerful seat.

If you fancy crafting your own small form factor PC build—chair-shaped or otherwise—a good place to start is with our guide to the best Mini-ITX motherboards. If, like me, you simply enjoy bearing witness to the hard work of others, may I recommend angling your peepers to this incredible 3D printed Palico PC case that was on display at Computex 2025? You're darn tootin' I'm going to keep bringing that one up at every opportunity. If you're still looking to upgrade but in the market for something much less showy, you can take a gander at our best mini PCs guide instead.


Best gaming PC: The top pre-built machines.
Best gaming laptop: Great devices for mobile gaming.

]]>
https://www.pcgamer.com/hardware/pc-cases/is-it-a-chair-is-it-a-pc-this-stealth-pc-build-is-both-and-probably-very-toasty/ neJ6njpRrLnkvFuYLRPgtT Thu, 29 May 2025 15:43:01 +0000
<![CDATA[ If you're getting a blank screen on reboot with an RTX 50-series GPU, use this Nvidia firmware updater ]]> If you've just bought an RTX 50-series card and are experiencing a blank screen upon system reboot, Nvidia has pushed out a GPU UEFI Firmware Update Tool that should fix the problem. And while the company only advertises it for RTX 5060 and RTX 5060 Ti cards, it is reportedly compatible with other RTX 50-series GPUs, too. Though I would hold off updating unless you're having the blank screen problem.

Nvidia says: "To ensure compatibility with certain motherboards, an update to the NVIDIA GPU firmware may be required. Without the update, GeForce RTX 5060 series cards [5060 Ti and RTX 5060] in certain legacy motherboards could experience blank screens on reboot. This update should only be applied if blank screens are occurring on reboot."

Although Nvidia only speaks about the RTX 5060 and RTX 5060 Ti, an Nvidia support representative told one forum user that it should support all RTX 50-series cards and that it's "for all vendors". The Nvidia rep also says that "having it updated is better" than not updating. The forum user says they've used it on their RTX 5080 and have so far had no issues.

The Nvidia support agent explains that this firmware update shouldn't improve general stability, performance, or thermals, as it's only designed to resolve black screen issues caused by compatibility problems with certain motherboards. Obviously, to the extent that it resolves those issues, stability will be improved (not having a black screen on reboot certainly seems like a big stability improvement), but that seems to be the only real benefit to this firmware update.

Given that, and despite the support agent saying it's "better" to have the firmware updated, I'd personally not bother installing it unless I was experiencing issues. Firmware updates are mucking around at the lowest level you can, really, unless you get out the screwdriver, and carry a risk if something were to go wrong. Given there seems to be no benefit for those not experiencing black screen issues I'd give it a pass unless it's required.

MSI RTX 5060 graphics card

(Image credit: Future)

I must say it's also somehow not too surprising that it's the RTX 5060 and RTX 5060 Ti that Nvidia says are experiencing these issues—well, the former isn't surprising at least. Nvidia didn't even have the right drivers ready for us here at PC Gamer, nor for many other reviewers, until launch day. (That was a joy for our hardware commander Dave to deal with.)

Saying that, higher-end RTX 50-series GPUs haven't been immune from issues that are resolvable by driver update, either. And not just stability ones, but performance ones, too. Nvidia's recent driver update that gives some RTX 5080 and RTX 5090 laptop GPUs a performance boost springs to mind.

At the end of the day, it's up to each individual whether they decide to install the firmware update for that extra motherboard compatibility. If you're getting black screens on reboot with an RTX 5060 or RTX 5060 Ti—or a higher-end RTX 50-series card, for that matter, I suppose—I don't doubt the answer will be 'yes'.

To get it, go to Nvidia's relevant support page and hit the download link, then run through the app's update process.


Best CPU for gaming: Top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game first.

]]>
https://www.pcgamer.com/hardware/graphics-cards/if-youre-getting-a-blank-screen-on-reboot-with-an-rtx-50-series-gpu-use-this-nvidia-firmware-updater/ 6Luta6eULXPiQ6ckjaMMS Thu, 29 May 2025 15:15:41 +0000
<![CDATA[ Yes a US court has blocked Trump's tariffs but PC hardware isn't out of the water just yet ]]> Another twist to the back-and-forth, up-and-down tale of tariffs: A US court has just ruled that President Trump doesn't have the right to impose the 'liberation day' tariffs. But before we get our pint glasses ready to cheer for computer hardware prices, and to burst a bubble before it inflates too much, note that this ruling doesn't apply to semiconductors and other such sector-specific tariffs.

According to Reuters, the US Court of International Trade has blocked most of Trump's tariffs by ruling against his use of the International Emergency Economic Powers Act to justify them. Trump had used this act to claim the new tariffs were justified under national security.

As a quick recap, at the start of April, President Trump announced big 'reciprocal' tariffs against various countries, especially China. Since then, there's been so much back-and-forth and uncertainty that it's hard to keep up—you can read my summary here if you're interested.

In short, though, while the bulk of these reciprocal tariffs had been rolled back and paused, there was still a looming threat that they could be imposed again, and of course, they still had an impact while they were in effect. Regarding this latter point, the BBC notes that if this ruling is upheld, businesses could receive refunds with interest on what they've been tariffed.

There is most certainly an "if" in play here, though, because the White House has appealed the ruling, and it must go through higher courts, potentially all the way up to the Supreme Court.

Back to the now, though: This shouldn't affect PC hardware. Heavy tariffs can still be levied against semiconductors and other derivative products because these are sector-specific tariffs that have supposedly had more consideration going into them than a blanket country-wide 'reciprocal' tariff.

It's still unclear what the US will do with semiconductors and other downstream computing products, as the administration is still looking into the effects on national security of chip and computing imports. While we've already seen PC gaming hardware companies affected by tariffs, it's hard to make any solid predictions about what will happen moving forward.

Whatever does happen, it seems like this particular ruling won't have a direct impact. Though it's always good to remember the interconnected nature of the global supply chain. Even if chips and some other computing products themselves can still be tariffed to the sky, the lowering of other blanket tariffs could still have a positive impact on prices.

And none of this is to mention the impact of market speculation. We've seen good signs on that front, but these seem to have fizzled out somewhat already. Beats me—I'll leave all that to economists and those on Wall Street.


Best gaming PC: The top pre-built machines.
Best gaming laptop: Great devices for mobile gaming.

]]>
https://www.pcgamer.com/hardware/yes-a-us-court-has-blocked-trumps-tariffs-but-pc-hardware-isnt-out-of-the-water-just-yet/ inASpQqoVQ6EWkhZbJQ8FX Thu, 29 May 2025 14:06:14 +0000
<![CDATA[ GeForce Now finally gets a native app for the Steam Deck, solving what was our biggest complaint with the streaming service on Valve's plucky handheld ]]> GeForce Now is available on Steam Deck… uh, didn't we know that already? Sure, the process is a bit fiddly, but at least Nvidia put together a script last year that somewhat streamlines the installation process—pretty handy considering not everyone has the cash to splash on a GeForce RTX 4080 GPU for their rig. But wait, there's now an even better way to play.

That's right, GeForce Now is finally available as a native Steam Deck app. I wrote about the initial announcement back in January, but I'm sure Jacob from almost two years ago feels very vindicated. It's been a long time coming for sure, but from today, GeForce Now members can stream over 2,000 titles from the service's cloud library straight to their Valve handheld.

This Steam Deck app replaces the previously browser-based workaround for a game streaming experience that leverages the handheld's portable form factor while also being surprisingly kind to its battery. Okay, so you still have to jump through some hoops to download it, detailed below. Still, it is a dedicated app that frees you from a Chrome browser-based experience—and I personally don't mind a little additional hand-holding through the process either.

I've tried the application on Steam Deck and it seems to work just fine, though the installation process for me, pre-release, was a little different to the one now available.

Here's how to install the GeForce Now app on your Steam Deck:

  • Hop into desktop mode on your Steam Deck (Steam, power button, 'switch to desktop')
  • Make your way over to the GeForce Now website and look for Steam Deck on the download page
  • Download the app
  • Follow the on-screen installation instructions
  • Job done

To be clear, GeForce Now is a streaming service for games you already own via other digital storefronts like Steam; a GeForce Now membership alone isn't going to grant you access to a suite of new games—buuut you can buy compatible titles you don't already have via this new app. Steam sales are already dangerous for my bank balance, and the GeForce Now app promises to offer no respite…

GeForce Now subscription pricing tier information in USD.

(Image credit: Nvidia)

The GeForce Now Steam Deck app promises game streaming at up to 4K resolution and 60 fps, while also freeing up your handheld's hard drive space and allegedly saving close to 50% battery life compared to gaming natively. Premium GFN members—that's folks paying upwards of ten bucks a month—also enjoy HDR10 support in both handheld and docked modes (not to mention expanded mod support, as that's another story our James jumped into back in January).

Support for up to 90 fps on Steam Deck is also in the works. At present, you'll only get streaming at up to 1440p 120 fps when the Valve handheld is hooked up to a monitor, and up to 4K 60 fps if you connect your Steam Deck to a TV.

Ultimate tier GFN members—that's folks spending about 20 bucks a month—are pitched a cloud-based streaming performance comparable to a GeForce RTX 4080 GPU desktop. Our Dave is a fan. Between the price of a brand new Steam Deck (or a refurbished handheld if you can find one) plus perhaps a slightly less shiny GFN membership, and the price of an even pretty decent GPU, I think I know which one I'd prefer. Whether my notoriously dodgy router would like it, though, remains to be seen.


Best handheld gaming PC: What's the best travel buddy?
Steam Deck OLED review: Our verdict on Valve's handheld.
Best Steam Deck accessories: Get decked out.

]]>
https://www.pcgamer.com/hardware/geforce-now-finally-gets-a-native-app-on-steam-deck-solving-our-biggest-complaint-with-the-streaming-service-on-valves-plucky-handheld/ RN6nYJJ6VEBiUqaaefwmdV Thu, 29 May 2025 13:00:00 +0000
<![CDATA[ The RX 9070 XT might not be the truly mid-range graphics card I'm still dreaming of but it turned my latest sci-fi PC build into a frame rate menace ]]> Oh, please let it end. Please. I was so excited to start writing these features. The plan was simple enough. Every other month, I pitch a new and exciting PC build with the latest and greatest hardware available, do an in-depth guide, cover the part selection, take some lovely amateur photography, and everyone's happy—at least that was the theory. Then Nvidia "launched" its RTX 50-series, and the whole market went to hell in a handcart. Graphics card? In stock? What's that? It's basically impossible to find anything at its retail price at this point, and you know, available. You have to sit on lottery lists, Discord stock channels, and pre-order promises to even be in with a whiff of a chance of picking up a graphics card at its retail listing.

When AMD's RX 9070 XT launched, I was super pumped. An affordable graphics card that delivered on its promises with a ton of stock straight out of the gate and a retail listing of $600. Yes, AMD, yes. Let's go. Honestly, it sounded incredible, and for a few weeks it was true. So, as I do, I priced up a build, called in the parts, and then, three weeks later: no stock, a price bump, and it's almost twice the cost. Nice.

Because there's so little stock of anything, everyone is buying everything they can—enthusiasts, industry, scalpers, all of 'em. Including the RX 9070 XT. It doesn't matter if it's the latest and greatest or a two-year-old has-been; if you see a GPU in stock somewhere at a reasonable price, you're probably dreaming.

That was kind of to be expected, of course; the 9070 XT is a good GPU, and if you can get one, it performs incredibly well, butting heads quite easily with Nvidia's RTX 5070 Ti, and its ilk, and it particularly dominates at 1440p. Yet, the cheapest I could find it with an "available soon" tag at time of writing was $860. Only 260 bucks more than its launch price.

Anyway, if you're reading this in the future and the tariff turmoil and pathetically poor stock situations have somehow managed to miraculously resolve themselves (or you live in the UK or elsewhere with reasonable prices and actual product availability), let's talk details. This is meant to be a "mid-range" GPU build (if you can call $600 mid-range in the modern era), and there's a lot to discuss. As far as the PC itself goes, I've taken some liberties on the parts to really push the Evolv X2 and its accompanying components to the limit. Best gaming PC you can buy today? Not quite, but it could be with just a few small changes.

The Specs

| Component | Model | US Price | UK Price |
| --- | --- | --- | --- |
| CPU | Intel Core Ultra 7 265K | $298 | £315 |
| GPU | Asus Prime OC Radeon RX 9070 XT 16 GB | $860 | £690 |
| Chassis | Phanteks Evolv X2 | $170 | £129 |
| Motherboard | Asus ROG Maximus Z890 Hero BTF | $620 | £650 |
| Memory | Corsair Dominator Titanium DDR5-7200 32 GB | $205 | £180 |
| SSD | Samsung 9100 Pro 4 TB PCIe 5.0 | $500 | £459 |
| CPU Cooler | NZXT Kraken Elite 360 RGB 2024 AIO | $314 | £280 |
| Cooling | 4x Lian Li Uni Fan TL Wireless 120 mm, 3x Lian Li Uni Fan TL Wireless Reverse 120 mm | $261 | £189 |
| Power Supply | Phanteks AMP GH 850 | $279 | £105 |
| Total | | $3,507 | £2,997 |
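As a quick sanity check on the table above, here's a throwaway sum of the listed prices (figures taken straight from the table; street prices will obviously drift).

```python
# Prices exactly as listed in the spec table above (USD, GBP)
parts = {
    "CPU":          (298, 315),
    "GPU":          (860, 690),
    "Chassis":      (170, 129),
    "Motherboard":  (620, 650),
    "Memory":       (205, 180),
    "SSD":          (500, 459),
    "CPU Cooler":   (314, 280),
    "Cooling":      (261, 189),
    "Power Supply": (279, 105),
}

usd_total = sum(usd for usd, _ in parts.values())
gbp_total = sum(gbp for _, gbp in parts.values())
print(f"US total: ${usd_total:,}  |  UK total: £{gbp_total:,}")
# -> US total: $3,507  |  UK total: £2,997
```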

It might not look like it, but I have tried to tone down some of the component selection here compared to my last RTX 5090 PC build. And, well, I kinda got a little sidetracked. That's namely because of Phanteks' latest chassis, the Evolv X2.

I've long been a fan of its Evolv line; it's almost always delivered on the quality front, and when I saw it was "BTF" compatible as well, well, I just had to try it out in that configuration. Particularly given how gorgeous the press shots looked. The only problem is that that motherboard is 20% of the total build budget on its own. Yeah…. That said, there are a lot of parts that you can cut back on without necessarily impacting overall performance, and I'm going to go into detail on that in just a moment. Still, it's a fairly big price tag up there, I'll admit, certainly for what the RX 9070 XT is meant to represent.

The core componentry that's driving performance in this rig (not including the case, the power supply, and the cooling) comes to around $2,511. There are some corners you can cut here to bring the overall price down, and of course the roughly $1,000 of tertiary stuff can be trimmed too, so it's not all terrible news.

A gaming PC lit up with neon-style lighting and using an RX 9070 XT graphics card.

(Image credit: Future)

For the processor, I've gone with Intel's Core Ultra 7 265K. Despite a particularly rocky launch, Intel and Microsoft have really ironed out the bugs since its debut when it comes to overall performance. On the whole, it's now an incredibly well-rounded chip and surprisingly efficient, with plenty enough grunt to handle the RX 9070 XT's frame-generating shenanigans.

Intel's done away with Hyper-Threading and instead opts for a single thread per core throughout, utilizing eight Performance cores and 12 Efficient cores for the Ultra 7. It taps out at about 5.5 GHz under load and comes with a healthy 36 MB of L2 cache, all built off the back of TSMC's N3B manufacturing process. At $339, it's a relative bargain compared to the Ultra 9 285K's $590 price tag. For $250 extra, all you get are four more cores and a slight bump to clock speed, and that's kind of it. Still, whether it's rendering or gaming, Intel's well-rounded Ultra platform should produce the numbers. I hope.

My first, ridiculously overpriced PC part pick is Asus's Z890 Hero BTF motherboard. $700. Yeah, ouch. The standard Hero comes in at $100 less than that, and to be honest, you can get some really clean boards with solid connectivity at around the $300-400 mark anyway for the Z890 chipset. Sadly, though, if you want BTF with Intel's Ultra line and you want to ditch those front-facing connectors, then it's the only one around right now. It's also extremely over-engineered for this rig, with a 22+1+2+2 power stage design, support for six M.2 slots (3x 5.0 and 3x 4.0), and an absolutely wild amount of rear I/O.

A gaming PC lit up with neon-style lighting and using an RX 9070 XT graphics card.

(Image credit: Future)

In fact, it's got the works: 2.5G and 5G Ethernet, WiFi 7, three Type-C ports (with various Thunderbolt 4 and USB 10 Gbps configs), plus eight USB Type-As and the usual assortment of odds and ends. It's massively overkill for what we've got going on here, but y'know, needs must. Oh, and it's also got AI everything, because it's 2025 and AI, obviously.

For the CPU cooling, I've equally gone for something a little over the top in the form of NZXT's Kraken Elite 360 RGB (the 2024 edition). This is an incredibly interesting product. Somehow, NZXT has managed to circumvent Asetek's rather aggressive pump patent using a "custom-designed NZXT Turbine pump" solution that supposedly delivers a 10% improved flow rate. Not going to lie, pretty impressed by that. Not the pump delivering more coolant, but the dodging of Asetek's legal team. On top of that, the 2024 edition also comes with a far fatter 2.72-inch IPS LCD display (640x640 @ 60 Hz), and all of the cabling, power, and USB connectivity is handled by a breakout cable running from a controller directly attached to the radiator itself. Lovely stuff.

Downside? Very expensive. $320 expensive. That's even pricier than Tryx's Panorama 360 with its curved OLED display. Still, it does look incredible.

A gaming PC lit up with neon-style lighting and using an RX 9070 XT graphics card.

(Image credit: Future)

On to RAM, and I've decided to go with something a little different, and again more to lean into the aesthetics of the overall system rather than cost-efficiency. That's going to be Corsair's Dominator Titanium memory kit, with 32 GB of the good stuff, at 7200 MT/s C34. That's a real tight real-world latency on that, and pretty much as good as you can get these days. Plus, they look stellar, and I'm a big fan of the overall design. Still, this is one of those areas you can save a bit of cash. Drop down to a comfortable lower-spec 6,400 MT/s kit at the $100-120 mark and you'll be laughing (it doesn't have to be Corsair, don't @ me).

For storage, I've forsaken my twin SSD mantra and opted for a single large drive instead. I have split this into two partitions to try and make my life a little easier, opting for 1TB for the OS and 3TB for the game and backup storage. That's all inside of a 4 TB Samsung 9100 Pro. It's a seriously rapid PCIe 5.0 drive, at least on the sequential front, and, although not exactly super quick on the random 4Ks, it does deliver some solid performance in games nonetheless. It is pricey, though, and again, if you dropped it to something a little cheaper, maybe a 2 TB Crucial T500 for your secondary storage and a 1 TB T700 for your OS, you'd be saving around $200 on the equation, with little reduction in overall performance (albeit a 1 TB loss to total storage space).

But we're here for the GPU, right? AMD's RX 9070 XT delivers some seriously strong ray tracing performance and kicks the RTX 5070 Ti to the curb for a, sort of, lower price (if you can find it). It's built off the back of AMD's Navi 48 XL GPU die, on TSMC's N4C manufacturing process (confusingly a 5 nm solution), and comes complete with 16 GB of GDDR6 VRAM on a 256-bit bus, delivering around 640 GB/s of total bandwidth. TGP isn't terrible, at 304 W, but the Asus Prime OC unit I'm using here does require no less than three 8-pin PCIe power connectors, which honestly makes me long for 12VHPWR again (more on why that is in a moment).
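That 640 GB/s figure follows directly from the bus width quoted above and the effective memory speed it implies (20 Gbps GDDR6 is an inference from those two numbers, not something stated here): bandwidth is the effective rate per pin multiplied by bus width, divided by eight to convert bits to bytes. A minimal sketch of that arithmetic:

```python
# Memory bandwidth from the figures quoted above:
# effective data rate per pin (Gbps) x bus width (bits) / 8 bits-per-byte
bus_width_bits = 256   # RX 9070 XT memory bus, as quoted above
effective_gbps = 20    # GDDR6 speed implied by the 640 GB/s figure

bandwidth_gbs = effective_gbps * bus_width_bits / 8
print(f"Memory bandwidth: {bandwidth_gbs:.0f} GB/s")  # -> 640 GB/s
```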

A gaming PC lit up with neon-style lighting and using an RX 9070 XT graphics card.

(Image credit: Future)

Otherwise, it's a killer card, making a serious case for itself as one of the best graphics cards of 2025, delivering comfortably at 4K with 40-60 fps in even some of the most aggressive titles out there, without FSR assistance. I've also gone ahead and included one of Phanteks' premium vertical PCIe 4.0 GPU brackets with this. To be clear, you do not need this in your build, and it does add an additional $100 onto the cost above. But! You can angle your graphics card with it (if you've got the cable space), plus it comes with a strip of RGB that perfectly matches the Evolv X2's stylization.

Speaking of the Evolv X2. This thing is stunning to look at. There's something about its monolithic, obelisk-esque shape that just draws the eye. A mixture of curved plastic stylized to look like brushed aluminum, tempered glass, fan recesses, and a clever plinth styling that really sets it apart from the crowd. It supports up to seven 120 mm fans (no love for 140 mm here) operating in a chimney-style cooling solution, drawing cool air up through the PSU shroud and the bottom of the chassis up into the main chamber and then exhausting directly out of the roof and the rear. One thing to bear in mind, however, the X2 absolutely isn't a liquid-cooling chassis. Certainly not in the same way its predecessors were. Perhaps it's a sign of the times and how far custom loops have fallen out of mainstream PC building culture, but it does only support a single 360 mm radiator in the roof, and that's it.

As for the final puzzle pieces here, for cooling I'm running seven of Lian Li's Wireless TL 120 mm fans, in all black (confusing to choose and install, but surprisingly effective and easy to set up), and for the power supply I've nabbed one of Phanteks' AMP GH 850 W units. According to PC Part Picker, at most the build should draw around 712 W from the wall, giving us around 16-17% headroom on that unit, plus the braided cables, 80+ Gold rating, and reasonable price point make it a sure-fire pick for a build like this.
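For the curious, that headroom figure is simple arithmetic on the numbers above (PC Part Picker's worst-case draw estimate versus the PSU's 850 W rating):

```python
# PSU headroom from the figures quoted above
psu_rating_w = 850       # Phanteks AMP GH 850
estimated_draw_w = 712   # PC Part Picker's worst-case estimate for this build

headroom_w = psu_rating_w - estimated_draw_w
headroom_pct = headroom_w / psu_rating_w * 100
print(f"Headroom: {headroom_w} W ({headroom_pct:.1f}% of the PSU's rating)")
# -> Headroom: 138 W (16.2% of the PSU's rating)
```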

The Build

The Performance

There's a lot going on here from a pure performance perspective. In many ways this machine is unreasonable. It's not something I'd actively recommend folk buy, which probably sounds quite counterintuitive given it's my job to write a compelling argument for this build I've priced together, but from a cost-efficiency perspective, it doesn't quite make sense. We all know that you don't pair an Intel Core i3 with an RTX 4090, for instance; it's just wrong, and in a lot of ways there are picks here that do echo that sentiment, albeit in perhaps a more gentle manner. A $700 motherboard, for instance, paired with a $324 processor and an $860 GPU with a 4 TB PCIe 5.0 SSD is, on the surface, unnecessary, and I'll hold my hands up and admit to that.

This was more about trying out a whole host of concepts in a single build rather than really ensuring the best bang for the buck PC possible. Does the Evolv X2 hold up with BTF? Is the Core Ultra 7 a worthy processor today compared to launch? Can the RX 9070 XT keep up with the latest and greatest Nvidia offerings? And just how good can you make this whole cornucopia of parts look together? Those are the questions I was keen to answer.

Wading through the performance metrics, we can see that a lot of those questions are answered in the affirmative. The Ultra 7 is a dominant chip, with strong single-core performance in Cinebench and decent 7-Zip and Blender scores too, far greater than when it first launched. It's a real shame the launch went the way it did, because right now it feels like it has the chops to be one of the best CPUs you could buy, certainly for the price. The RX 9070 XT likewise manages those 1440p titles very well, scoring on average 100 fps across all five of our games on test. Even Cyberpunk, with ray tracing ramped up and FSR disabled, landed 67 fps on the average frame rate. Chuck in FSR and frame generation as well, and scores easily shoot above 100 fps at 1440p.

The thing that impressed me the most, however, was the temperatures. I was apprehensive going into this, particularly given the chassis design. Evolvs haven't exactly been known for top-tier airflow, and there's a lot of glass here. Combine that with the fact that those three intakes are drawing air in from a crowded PSU compartment, jam-packed with cables, with an overall limited number (by today's standards) of 120 mm fans, and, on paper at least, it does seem like a potential disaster waiting to happen. Yet no single component hit above 80°C. The Core Ultra 7 topped out at 79 degrees, and even under load in-game the RX 9070 XT still slid just under the 60°C mark. Similarly, VRM and SSD temps remained incredibly cool throughout. A true testament to the design of modern cooling solutions.

The Conclusion

A gaming PC lit up with neon-style lighting and using an RX 9070 XT graphics card.

(Image credit: Future)

So then, what can we take away from this build? This effervescent obelisk? Building in it with BTF is surprisingly well thought out. Even with stock parts and standard-length power supply cables, Phanteks has really gone to town on the cable management solutions here, and it really shows. The finished product, particularly with Lian Li wireless fans, looks an absolute treat. The only major downside was working around that limited power supply.

To be completely transparent, we created a similar version of this build for the print mag (ideally identical, but print deadlines are a nightmare at times). At the time of the photoshoot, our Asus Z890 Hero BTF board had yet to arrive, so I used the stock-standard Z890 Hero variant we had in-house instead. That was great, except that board requires an additional 8-pin PCIe connector just below the 24-pin to even power on. The problem with that is that the RX 9070 XT also needs three of those 8-pin cables, and the AMP GH 850 only comes with three single 8-pin cables. There are no dual connectors or anything included in the box, nor any additional PCIe power ports on the PSU itself.

So, despite the system's power requirements being 16% lower than the PSU's rating, out of the box you don't have enough connectors. In the end I bypassed this by using a larger 1,000 W model in the print version instead, but that is an additional cost that shouldn't really be necessary. I'm not even sure the GPU really needs all that extra juice either: in my own testing, power draw never went above 589 W for the entire system.
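To put some rough numbers on that suspicion, using the commonly cited PCIe limits (150 W per 8-pin connector, 75 W from the x16 slot) and the card's quoted 304 W TGP, a triple 8-pin card has far more connector capacity available than it actually needs:

```python
# Rough power budget for a triple 8-pin card (spec limits, not measured values)
slot_power_w = 75    # PCIe x16 slot limit
per_8pin_w = 150     # rated limit per 8-pin PCIe connector
connectors = 3       # Asus Prime OC RX 9070 XT, as above
card_tgp_w = 304     # quoted TGP

available_w = slot_power_w + connectors * per_8pin_w
print(f"Available to the card: {available_w} W")                 # -> 525 W
print(f"Spare capacity over TGP: {available_w - card_tgp_w} W")  # -> 221 W
```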

A gaming PC lit up with neon-style lighting and using an RX 9070 XT graphics card.

(Image credit: Future)

That's even more of an issue when you consider all the tertiary products that now require PCIe power as well. Not only do we have silly motherboards needing it to even boot (because apparently everyone needs to connect 36 fans off a single header these days), but you've also got things like Corsair's iCUE Link hubs and other third-party products drawing from them too, despite USB-C power delivery clearly being a thing. It's frustrating, especially given how high costs for PC parts are these days. It's like we're being funneled into buying larger PSUs, not because we need to, but because the entire ecosystem requires it. Hyperbole? Maybe a bit, but if it's a sign of things to come, I ain't a fan.

The saving grace for this build? Well, rather weirdly, despite being $100 more, the BTF variant of the Z890 Hero doesn't have that PCIe power mobo connector that the standard Hero does, so in this case, you can save the cash and grab a lower wattage PSU instead.

All said and done, I love it; this Tron-looking beauty is outstanding to look at, the Core Ultra 7 is incredibly dominant, and the RX 9070 XT is an awesome mid-range graphics card if you can find one in stock at close to its retail price. If prices do drop to reasonable levels again, 4K gaming on a mid-range budget is slowly but surely becoming a reality. Until then, well, we'll just have to wait and see.

]]>
https://www.pcgamer.com/hardware/gaming-pcs/the-rx-9070-xt-might-not-be-the-truly-mid-range-graphics-card-im-still-dreaming-of-but-it-turned-my-latest-sci-fi-pc-build-into-a-frame-rate-menace/ J2es8LKzGmPxLoNiRGgxF Thu, 29 May 2025 12:13:08 +0000
<![CDATA[ Nvidia's Hopper GPUs are now dead to the Chinese market after export controls that made the company take a 'multibillion-dollar write-off' ]]> There was more than the usual swell of anticipation for Nvidia's latest earnings call, primarily because the last quarter has been tumultuous in the wake of US tariffs and trade restrictions. On this front, and despite the fact that the AI chip giant still seems to be doing phenomenally well, Nvidia has admitted export controls have fully killed off its Hopper generation GPUs in China.

During the company's recent Q1 earnings call, Nvidia CEO Jensen Huang explained: "The H20 export ban ended our Hopper Data Center business in China. We cannot reduce Hopper further to comply. As a result, we are taking a multibillion-dollar write-off on inventory that cannot be sold or repurposed. We are exploring limited ways to compete, but Hopper is no longer an option."

Hopper is the company's previous-gen GPU/AI accelerator architecture. While its Blackwell architecture—the architecture at the heart of the RTX 50 series—is rolling out to fill up data centres despite previous delays, Hopper chips still line many server racks and they were the primary Nvidia export to China.

The past couple of years have seen the same scene play out over and over again: The US restricts what Nvidia can export to China, Nvidia starts exporting a slightly less powerful Hopper chip to China, then the US restricts it further so that less powerful Hopper chip is restricted, too. Rinse and repeat.

No longer, though, according to Nvidia. Now, there is seemingly no less powerful chip that Nvidia can comfortably make and export to the country. Nvidia Hopper is dead in China.

Nvidia Hopper GPU die

Nvidia Hopper GPU die (Image credit: Nvidia)

Nvidia CFO Colette Kress says: "our outlook reflects a loss in H20 revenue of approximately $8 billion for the second quarter." H20 is the Hopper chip that Nvidia was previously exporting to China, and $8 billion revenue loss for Q2 is a lot more than the company lost for Q1.

Nvidia had previously said that it could lose $5.5 billion in Q1 because of export restrictions, but it looks like that amount turned out to be $2.5 billion in the end: "We recognized $4.6 billion H20 in Q1. We were unable to ship $2.5 billion, so the total for Q1 should have been $7 billion."

Despite praising President Trump's "bold vision", the company doesn't seem to agree with his trade restriction strategy in this case. Huang says: "The question is not whether China will have AI, it already does. The question is whether one of the world's largest AI markets will run on American platforms. Shielding Chinese chipmakers from U.S. competition only strengthens them abroad and weakens America's position."

Your next upgrade

Nvidia RTX 5090 Founders Edition graphics card on different backgrounds

(Image credit: Future)

Best CPU for gaming: The top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game ahead of the rest.

We've heard Huang say similar before, and it's certainly an argument to take seriously. At the same time, though, we can hardly expect the CEO of a chip company to support the banning of its exports to one of its biggest markets.

The China export restrictions were certainly the main talking point in the earnings call, other than the usual "AI factory" stuff and a sliver of gaming talk. On that front, Nvidia claims a "record $3.8 billion" gaming revenue, but the wow-factor shrivels a little when we remember that Nvidia's pushed out a bunch of its new GPUs over a very short period, so we can expect an inflated number there. Nvidia all but admits this when it calls Blackwell its "fastest ramp ever"—that's "fastest", not "biggest".

Anyway, trade talk aside, Nvidia seems to be doing pretty well in the wake of this news. I'm sure the multi-billion-dollar company will survive Hopper waving farewell to China.

]]>
https://www.pcgamer.com/hardware/graphics-cards/nvidias-hopper-gpus-are-now-dead-to-the-chinese-market-after-export-controls-that-made-the-company-take-a-multibillion-dollar-write-off/ vtAohuGjPcKv5Bzmx7M5CG Thu, 29 May 2025 10:45:54 +0000
<![CDATA[ Texas Senate to vote on bill that restricts social media access for children, while parental consent for app downloads will be required from next year ]]> As is not unusual among folks of a certain age, it's hard not to wonder about the effects of unfettered internet access on my impressionable younger self. Ah well, back to doomscrolling and staring into the vast content pit it is. Wait, what was I doing? Oh, yes, the news!

Earlier this week Texas governor Greg Abbott signed into law a bill that will require both Google and Apple's app stores to verify the age of their users from January 1 (via Reuters). Once this law comes into effect in 2026, folks under the age of 18 throughout the state will have to get parental consent to download apps or make in-app purchases. Texas also has another bill awaiting a Senate vote that aims more specifically to restrict children's access to social media apps, too.

Apple and Google are understandably less than keen, arguing that the blanket age verification requirements overreach and making the case it's really only necessary for select apps. Apple issued a statement to Reuters, saying, "If enacted, app marketplaces will be required to collect and keep sensitive personal identifying information for every Texan who wants to download an app, even if it’s an app that simply provides weather updates or sports scores."

Still, this isn't necessarily a done deal. Last year in Florida, Governor Ron DeSantis signed into law a ban on social media accounts for anyone under the age of 14. This February, a judge considered blocking the ban amid concerns it would unconstitutionally curtail free speech. As of right now, the ban stands in Florida, but a similar free speech challenge could find its way to slowing down the Texas bills.

The Apple and Alphabet-backed Chamber of Progress already has something to say on that front. The group's vice president, Kathleen Farley, told Reuters, "A big path for challenge is that it burdens adult speech in attempting to regulate children's speech. I would say there are arguments that this is a content-based regulation singling out digital communication."

Utah was the first US state to pass an app store age verification bill into law back in March of this year. This followed laws directly addressing minors' access to social media back in 2023, though obviously concerns about young people's access to apps and social media more broadly have been bubbling up the world over. For instance, last year Australia proposed a ban on social media for everyone under the age of 16 that will ultimately come into effect later this year. Tech-savvy teenagers across the land have likely already cracked a way to get around it.

As for the social media companies themselves, they've been surprisingly positive about these legislative developments stateside—though that's likely out of buck-passing relief. Meta, Snap, and X issued a joint statement in response to the Utah law's passing this year that said, "Parents want a one-stop shop to verify their child’s age and grant permission for them to download apps in a privacy-preserving way. The app store is the best place for it. We applaud Utah for putting parents in charge with its landmark legislation and urge Congress to follow suit."

Your next machine

Gaming PC group shot

(Image credit: Future)

Best gaming PC: The top pre-built machines.
Best gaming laptop: Great devices for mobile gaming.

Though I see the free speech argument, I only trust each of the big tech companies tangled up in this as far as I can throw them—to say nothing of the state of Texas pulling the 'think of the children' card. I'm not going to stand here and pretend I only ever had positive experiences with social media as a young'un, but it would also be remiss not to acknowledge how it opened up my world when the walls of my day-to-day looked miserably narrow. Age verification and a blanket ban would've protected me from some things, while also potentially reinforcing how hopeless I felt…if I didn't bother to figure out how to sideload my apps or otherwise circumvent the need for age verification.

The trouble with bans, in my humble opinion, is that they often present a tough image without actually addressing the core issue. Arguably the 'core issue' here is not one straightforward thing—but the rollback of both content and fact-checking moderation policies by major players like Meta certainly doesn't help. In fact, there arguably aren't any well-moderated online spaces for young people, with even the CEO of the extremely popular Roblox saying, "Don't let your kids be on Roblox." The very real risks posed by social media to children aren't going to go away simply because all the young'uns have been banned, instead likely only creating more cracks for young people to disappear down.

]]>
https://www.pcgamer.com/hardware/texas-senate-to-vote-on-bill-that-restricts-social-media-access-for-children-while-parental-consent-for-app-downloads-will-be-required-from-next-year/ eyJwArn7vEP8KSdXyGpuVK Thu, 29 May 2025 09:45:35 +0000
<![CDATA[ Razer releases AI plugin for game engines to assist with logging bugs and Quality Assurance testing in games ]]> Quality Assurance in games is one of the most underrated parts of game development. These departments usually employ people to go over games with a fine-toothed comb, making sure everything works correctly. But, due to the ever more complex nature of gaming, this can often be incredibly tedious work, even if you genuinely love the game. This could be an area of game development where AI might actually be helpful, without infringing on creativity. Well, Razer certainly thinks so, releasing its new AI-powered gamedev tool, the QA Companion.

The tool sounds super handy, acting as an automatic bug finder and logger. Especially given bug hunting has been one of the smarter uses for AI so far. Razer's new tool claims to free playtesters up to play the games, rather than having to stop and log every detail themselves. It also boasts the ability to fit into already existing workflows devs likely have, as it's available as a plugin for Unreal, Unity, and custom engines using C++. It even has custom settings for different genres and styles of games, and all of this can of course be customised to fit whatever the devs are actually working on.

Razer's QA Companion isn't completely unique, as other companies are also coming up with similar-sounding AI testing companions for logging bugs. One example is TestBot from Mighty Build and Test, though this one focusses more on having a bot play your game. Both could be really handy tools to augment the QA process, but it's important to remember that's all these are: tools.

For good quality assurance, you need quality testers. The human element is still incredibly important, because vital context can be missing from automated logs. There are also natural human inclinations, the odd things real players try, that only a person is likely to act on, and those are often the most valuable finds in testing.

I don't believe any of these tools should be about removing people from testing games, because all you're going to end up with is games that can be played by robots. While there are plenty of bots to worry about in gaming, I don't think they're supposed to be the target audience.

To understand being a QA tester in games, here's a little thought exercise. Imagine you're playing your favourite game and you walk through a door. Ok, now go back and do it again but this time do it on an odd second instead of an even one. Ok, again but this time you have a different item equipped. Again, but approach it from a different angle. Good, now keep doing this for eight hours a day. If you're starting to picture Dr Strange confronting Dormammu then you're starting to get the idea.

It's something I didn't fully appreciate before I spoke to Megan Summers, an Aussie gamedev with a background in QA, several years ago over on Byteside.

"So you know, you've got your ground foyer in, you've decided where you're going to put the electricity wires, you've decided where the plumbing is going to go in. That's all the design and technical people." Explained Summers, adding "And then QA opens every door 10 times to make sure that the door hinges work, but they're also checking that the paintings are straight and they've also got to go and sit on every toilet and flush it 50 times to make sure.”

This is what it can take to help make sure your game doesn't have any weird bugs that might drastically affect the player, and also why it's not uncommon for so many bugs to go unfound. Unfortunately, there are only so many hours in a day, and not every developer has the budget to pour into QA. Having a tool that takes some of the logging work away means you can put more time into finding new ways to open that door, as opposed to writing out the bug details.

Razer's AI QA Companion is currently in beta and will become available on the AWS Marketplace soon. For now you can get a look at it and even sign up to join in on the beta here.


Best CPU for gaming: Top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game first.

]]>
https://www.pcgamer.com/hardware/razer-releases-ai-plugin-for-game-engines-to-assist-with-logging-bugs-and-quality-assurance-testing-in-games/ aNk5xAA3xyuFXFPnj9PoZG Thu, 29 May 2025 06:37:07 +0000
<![CDATA[ Forget saving thousands for the newest iPhone. One hacker has turned his NES Zapper gun from Duck Hunt into a laser driven wireless phone ]]>

The tech inside Nintendo's Duck Hunt Zapper gun is easily some of the coolest to come out of 80's gaming. The way this device worked with the NES to determine whether or not you'd shot those poor innocent pixel ducks is truly some ingeniously innovative stuff. Unfortunately, with modern technology we don't really have a lot of space for something designed to work with CRT TVs.

The next logical step is to see what else we can do with these cool light reading guns. In a true homage to 80's nostalgia one enthusiast has turned the Zapper into a working telephone.

Novelty household phones were huge in the 80's and 90's. Once folks realised those plastic outer shells could be shaped like basically anything, well they took that as a challenge. According to Hackaday, Nick has taken that challenge to the next level by developing this wonderfully impractical Zapper phone.

Instead of reading a CRT screen for the perfect input of light patterns, Nick uses the light-sensing capability of the Zapper to receive phone calls and transmit audio. It does this by reading the patterns in the laser light transmitted from the phone's base station and translating those waves into audio. It's actually based on the same principle as the photophone, invented back in 1880, which used light to transmit sound. What's old is new and old again.
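To get a feel for how sound can ride on a beam of light, here's a minimal numpy sketch of the principle described above; it is purely illustrative, not Nick's actual code or hardware. The transmitter varies the "laser" intensity with the audio signal, and the receiver recovers it by stripping the DC bias and low-pass filtering what the photodiode sees.

```python
import numpy as np

# Simulation parameters (illustrative only, not the real hardware's values)
fs = 48_000          # sample rate, Hz
duration = 0.05      # seconds of audio to simulate
t = np.arange(int(fs * duration)) / fs

# "Voice" signal: a simple 440 Hz tone, scaled so intensity never goes negative
audio = 0.5 * np.sin(2 * np.pi * 440 * t)

# Transmitter: laser intensity = DC bias + audio
laser_intensity = 1.0 + audio

# Channel: a little attenuation and noise to mimic an imperfect optical path
received = 0.8 * laser_intensity + np.random.normal(0, 0.01, t.size)

# Receiver: remove the DC bias, then low-pass with a crude moving average
dc_removed = received - received.mean()
kernel = np.ones(8) / 8
recovered = np.convolve(dc_removed, kernel, mode="same")

# Compare recovered signal with the original tone (correlation close to 1)
corr = np.corrcoef(audio, recovered)[0, 1]
print(f"Correlation between sent and recovered audio: {corr:.3f}")
```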

He figured out the device could be used this way after doing a complete teardown of the Zapper, which you can also check out on his channel. In that teardown he got a good understanding of how the gun worked, and also figured out how to bypass the anti-cheat measures Nintendo included in the Zapper.

By Nick's own admission, there is one real downside to the incredible Zapper Phone. It has to be pointed perfectly to receive the laser, which isn't very useful for a wireless device.

"It's actually quite bad as a phone." Says Nick, adding "The sound quality is quite good, but you have to have line of sight with a laser, and it has to be lined up almost perfectly." Fortunately, the tech here is so cool, I really don't care how useful the device is.

As these guns didn't come with microphones, for some reason, Nick has also added one for this project. The resulting setup is this kind of magical laser communication device that transmits audio between the base station and the gun-receiver. Also, you get to talk into an NES Zapper gun; what more could you want?

You can get a better look at Nick's creation in the video above, including an explainer on how it works, and a demo video of the final result. As I've said, it's not exactly pretty, nor practical, but it is really freaking cool.


Best CPU for gaming: Top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game first.

]]>
https://www.pcgamer.com/hardware/forget-saving-thousands-for-the-newest-iphone-one-hacker-has-turned-his-nes-zapper-gun-from-duck-hunt-into-a-laser-driven-wireless-phone/ D2P9wbUfEe2EzsSJ6wPGq5 Thu, 29 May 2025 04:21:01 +0000
<![CDATA[ Corsair's new One i600 mini PC packs an RTX 5080 into a stunning understated wood panel case ]]> One of the best-looking prebuilt compact PCs is back with the new Corsair One i600 PC. This new mini-beast looks a lot like last year's Corsair One i500 with a beautifully toned-down appearance and hard-hitting specs. And of course, a price tag to match.

The new Corsair One i600 follows in the series' footsteps by hiding a GeForce RTX 5080 graphics card in its unassuming, elegant design. This is paired with an Intel Core Ultra 9 processor that seems to be in almost every gaming machine available at the moment. Both have been assigned their own 240 mm liquid coolers, which they no doubt need in order to keep temperatures at acceptable levels in such a discreet case.

On top of that you've got 64 GB of DDR5 memory, which should be more than enough for any gaming you come across. Then there's 4 TB of NVMe SSD storage as default, which is really nice to see in a world where most machines' baselines seem to be 2 TB drives. Games certainly aren't getting any smaller, so a good amount of solid state storage has never been more necessary for fast load times.

To further manage all this you've got the Corsair desktop software. This kind of bloatware on these bespoke PCs can be really annoying, but I think in this case something to help keep an eye on those temperatures and manage them is a good idea.

While the specs are impressive, it's really all about that sleek chassis. It's built to optimise airflow, including fans on top of the liquid cooling, to really try to keep those temperatures down and the machine running silent. Given how quiet the Corsair One i300 PC was, I have a fair bit of trust that the brand can deliver a hushed machine, even with those internals.

But what I love most about these machines is the look. These i600 PCs come in two different styles for the front panel: Metal Dark and Wood Dark. The black metal is nice, but the Wood Dark is definitely where it's at. Not only does it set these rigs apart, but it's actually real wood, to the point of being FSC-Certified. I don't know why this matters, but it looks great, and really subtle too. The perfect stealth gaming PC design, but don't worry, it also has RGB lighting.

But the price, as to be expected, is not so stealthy. These fabulous-looking PCs will set you back $5,000. It's really not cheap, but it is gorgeous. It's also in line with previous prices in this line, so it doesn't look like tariffs have affected the cost of these too much. Still, I might have to go out and find a nice piece of wood to stick on the front of my case so I can pretend affording one of these would ever be within my reach.


Best CPU for gaming: Top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game first.

]]>
https://www.pcgamer.com/hardware/corsairs-new-one-i600-mini-pc-packs-an-rtx-5080-into-a-stunning-understated-wood-panel-case/ 7hpUcW7JYLtm3dGBAvRYwA Thu, 29 May 2025 03:30:06 +0000
<![CDATA[ AMD's exciting new mainstream RX 9060 XT GPU spotted in online benchmarks as June 5 launch day fast approaches ]]> Will AMD's new Radeon RX 9060 XT be a saviour for gamers on a modest budget? That's the hope as the new GPU's June 5 launch day approaches. Of course, with the card so close to availability, you might expect a few leaks. And sure enough an RX 9060 XT has popped up in Geekbench, as spotted by X user Benchleaks (the clue is in the name!).

Geekbench obviously isn't our favourite metric for a GPU. An actual game would be far better. A benchmark that uses something like a 3D game engine would be next best. But Geekbench is what we have, so what does it tell us?

Well, the RX 9060 XT notches up 109,315 points in OpenCL. For context, the official Geekbench results list puts an Nvidia RTX 5060 at over 120,000 points and the last-gen RX 6600 XT at a little over 80,000 points.

In the Vulkan test, the 9060 XT notches up 124,251 points. That's more competitive with the RTX 5060, which is listed at a little under 120,000 points.

Of course, AMD is claiming the RX 9060 XT is actually faster than the RTX 5060 Ti, let alone the plain RTX 5060. So, these Geekbench numbers look a little low in that context.
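For a rough sense of scale, here's the arithmetic on the scores quoted above. Treat these as ballpark ratios only: the figures aren't official, the comparison points are approximate listings, and Geekbench results shift with drivers and test systems.

```python
# Geekbench scores as quoted above (approximate listed values)
scores = {
    "OpenCL": {"RX 9060 XT": 109_315, "RTX 5060": 120_000, "RX 6600 XT": 80_000},
    "Vulkan": {"RX 9060 XT": 124_251, "RTX 5060": 120_000},
}

for api, results in scores.items():
    baseline = results["RX 9060 XT"]
    for card, score in results.items():
        if card != "RX 9060 XT":
            delta = (baseline / score - 1) * 100
            print(f"{api}: RX 9060 XT vs {card}: {delta:+.1f}%")
# OpenCL: RX 9060 XT vs RTX 5060: -8.9%
# OpenCL: RX 9060 XT vs RX 6600 XT: +36.6%
# Vulkan: RX 9060 XT vs RTX 5060: +3.5%
```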

It's worth noting that these are not official results. They're not definitely representing a final retail RX 9060 XT or really any 9060 XT at all. Likewise, whatever drivers were being used likely aren't final.

In a way, the RX 9060 XT isn't much of a mystery in any case. It's exactly half an RX 9070 XT in terms of the hardware, including GPU cores, memory bus width, the works. However, with a boost clock of around 3.1 GHz, the RX 9060 XT runs slightly faster than the RX 9070 XT, which tops out at 2.97 GHz, officially.

Of course, the big appeal with the RX 9060 XT is price. The base 8 GB model is MSRP'ed at $299, with the 16 GB listed at $349. That's in line with the RTX 5060, which is also $299 and very attractive compared to the 16 GB version of the RTX 5060 Ti, which is listed at $429.

Of course, these are all MSRPs and real-world prices can vary. A lot! As with all GPUs these days, the appeal of the RX 9060 XT will likely hinge on retail pricing, the realities of which we'll know in just over a week. Watch this space.


Best CPU for gaming: Top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game first.

]]>
https://www.pcgamer.com/hardware/graphics-cards/amds-exciting-new-mainstream-rx-9060-xt-gpu-spotted-in-online-benchmarks-as-june-5-launch-day-fast-approaches/ PZdAgUV33P5pGfZx9A3vFo Wed, 28 May 2025 16:33:21 +0000
<![CDATA[ Turtle Beach Stealth 700 Gen 3 review ]]> I've always admired Turtle Beach's Stealth 700 range. The cheaper headsets, like the Turtle Beach Recon 70, don't quite hit the right price-to-value ratio for me, and though the Turtle Beach Stealth 500 seems like a solid offering, I've always liked the more heavy-duty feel of the pricier gaming headsets. They have thick cushioning around the ears, a close but comfortable fit, and a striking sound quality to go along with it.

At this point, I've tested every generation of Stealth 700, and the latest is clearly the best, but the broader market has changed a lot recently, too. One of the biggest problems with buying a gaming headset right now is that it feels like there's a bit of a solved science to many price brackets. You got about $60? Go for the Corsair HS55 Stereo. Is your budget $200? The HyperX Cloud Alpha Wireless is the obvious choice.

The Turtle Beach Stealth 700 Gen 3 will have a tough time in its market, mostly because there are so many solid options already available. The question is no longer "is it good?" but "is it better?" The answer, I'd say, is "not quite, but it puts up a good fight".

The Stealth 700 Gen 3 I'm testing right now is the PC version, which comes in a sleek black colourway with light silver detailing—but I wouldn't exactly call it stealthy. With its thick cushioning, fairly hefty frame, and the Turtle Beach name slapped on the side, it is unmistakably a gaming headset. That's not necessarily a bad thing, but it's perhaps a little too gamery on the go, if such a thing is possible.

Turtle Beach Stealth 700 Gen 3 specs

Turtle Beach Stealth 700 Gen 3

(Image credit: Future)

Connection: 2.4 GHz wireless, Bluetooth 5.2
Type: Closed back
Frequency response: 20 Hz - 20 kHz (up to 10 Hz - 40 kHz on PC)
Drivers: 60 mm Eclipse dual drivers
Microphone: Flip to mute mic
Features: Dual USB connectors
Weight: 408 g
Battery life: 80 hours
Price: $200 | £180

The PC version isn't just the go-to for PC gamers because of its colours. It comes with unique features in Turtle Beach Swarm II, the audio software, like game chat boost, game chat mix, and Waves 3D, the 3D audio option. Unfortunately, the colour of the headset defines the platform, so if you purchased the Cobalt Blue model because you like the look, you are missing out on the 10 Hz to 40 kHz frequency response that is only possible on the PC headset.

It's worth noting that the PC version of the headset also works on PS4 and PS5, though not on Xbox. If you happen to also play on PlayStation, I don't see a single reason to buy the PlayStation-exclusive white headset instead.

The headset also comes with two 2.4 GHz connectors, as well as built-in Bluetooth 5.2 support. This means you can plug one receiver into a PC and one into a PS5, and swap from device to device by simply clicking a button on the headset. Bluetooth connectivity also runs simultaneously, which means you can watch a quick TikTok on your phone without having to take the headset off. It's super handy and one of the best features of the Stealth 700 Gen 3.

Unfortunately, though the ability to connect to multiple devices at once is superb, it can take more effort than I expected. Updating the headset, letting it go to sleep through inactivity, or swapping devices can take a moment, and it occasionally fails to connect on the first go, so light troubleshooting becomes a recurring part of ownership. Swapping devices never forced a full-on reset, but if the headset went into sleep mode, I'd occasionally have to reconnect manually. On top of that, there's no wired connection option.

The Stealth 700 Gen 3 is rather impressive when connected, though. It has a neutral sound profile with a light boost in bass, which means it's atmospheric for games and head bob worthy for music. Testing it in Runescape: Dragonwilds offered a real thump to the sound of a cow hitting the ground that almost made me feel bad. Almost. Then, the sounds of Dragonwilds, filled with birds singing and magic in the air, came through cleanly and clearly.

The explosive sounds of Doom: The Dark Ages are as punchy and weighty as you might expect, but the drivers are able to enunciate the parts in the soundtrack that aren't thumping and shrieking over the top of the mix. It allows nuance to slip under all that distorted guitar.

Controls are built into the side of the earcups and are very easy to navigate, which I was thankful to learn after being shocked by the pounding music in the Counter-Strike 2 lobby theme. Counter-Strike 2 itself is an appropriate test of the headset, as very bass-heavy gaming headphones sometimes drown out footsteps and reload sounds amid the thunder of an AWP shot. Despite a bassy sound, the Stealth 700 Gen 3 performs admirably and clearly.

There's also plenty of customization in Turtle Beach Swarm II, with simple EQ controls, and toggles for Waves 3D, a boost in game chat, and mic monitoring. I did find that above 80%, mic monitoring would result in a pretty nasty buzz. This buzz wasn't projected while speaking, so 65% is the sweet spot here.

Swarm II is a quality bit of software, with intuitive controls and easy access to future updates, but it's a pain to keep using. After a little while, the software simply refused to open, requiring a fresh install or a restart of my entire rig just to launch. On one PC, I had no problem with it. On another, it failed completely when trying to fetch an update and took a tonne of troubleshooting to get working again. Software can be finicky, so there's always a chance this is partially my rig's fault, but it feels worth noting here regardless.

The Stealth 700 Gen 3 has a nice snug fit, with a pretty secure clamping force on the head. It never caused me any pain or discomfort, but it is a stronger squeeze than I'm used to. Luckily, the top headband and side earcups are super cushioned, so the whole headset ends up just feeling cosy. I rather like the fit, but it is bulky and tight, so it may annoy others.

Buy if...

✅ You want to connect to multiple devices at once: Most headsets can do this now, but the Stealth 700 Gen 3 stands out by being able to connect to a console and PC via two different connectors, alongside a Bluetooth connection.

✅ You like a tight clamping force: I've always liked the admittedly snug fit of the Stealth 700 line, and it's one of my favourite headsets out there for comfort. This could be a sensory nightmare if you don't like to really feel your gaming headset as you play.

Don't buy if...

❌ You like the idea of the HyperX Cloud Alpha Wireless: The Stealth 700 Gen 3 shines in some areas, but we are absolutely smitten with the similarly priced Alpha Wireless, and our money would probably be going there instead.

The mic is okay for its price range. It's clear, with the software giving plenty of customization options. It can come across a tad tinny, even with AI noise filtering turned off. The flip-to-mute system also works perfectly, thanks to a small chime indicating every time you are muted. Just pop the mic down, and you are ready to get talking to your friends.

One of the biggest sins the Stealth 700 Gen 3 commits is matching the MSRP of the Cloud Alpha Wireless. Given that the latter headset often comes in cheaper because it's older, the comparison isn't the most flattering for the ol' Turtle Beach. The Stealth 700 Gen 3 has a notable battery life of 80 hours, but this pales in comparison to the 300 hours the Alpha Wireless will give you. As well as that, the older headset sounds great, has no software issues, and doesn't require an acquired taste for the close fit.

I have tried every Stealth 700 headset to date, and the Gen 3 is my favourite of the lot. However, that MSRP is high enough for the competition to be quite tight, and it's not nearly as easy to use as I had hoped. A comfy fit and great drivers are let down by mediocre software, and impressive yet somewhat inconsistent pairing.

]]>
https://www.pcgamer.com/hardware/gaming-headsets/turtle-beach-stealth-700-gen-3-review/ DGj7sbq6uNSVrkcCBvSgyC Wed, 28 May 2025 16:30:44 +0000
<![CDATA[ Listening to Google's CEO talking about what the future of AI holds just reinforces the fact that nobody can know what the future of AI holds ]]>

Every other week we seem to get a tech CEO or some other bigwig telling us how great AI is and how revolutionary the AI era is going to be. But whenever they start talking about what exactly that will look like, I'm reminded of the frightening strength of historical variability and humanity's inability to predict it. Case in point: Google CEO Sundar Pichai's recent interview with The Verge.

The part of the interview that stuck out to me the most was when Pichai responded to a question from The Verge referencing the possibility that the web "will turn into a series of databases for various agents to go and ask questions to" rather than something we actually interact with.

The CEO's response is, initially, to remind us that, "the web is a series of databases, etc. We build a UI on top of it for all of us to conceive."

And no, this isn't another 'the internet is a series of tubes' moment. Think about this for more than half a second and it seems obvious: the high-level interactions we have with any software are always a veil over the low-level machinations rolling forward underneath. But it's interesting to be reminded of this fact in the context of a supposedly new phase, paradigm, or stage of computing and the internet.

Being reminded of this really brings into focus the genuinely unimaginable extent of how things might change in the upcoming AI era, should it come to pass and the bubble not burst. The entirety of the web and applications in general, as we experience them, has been erected upon the premise that it's human beings using them.

Without that premise, there's essentially a blank canvas on top of the underlying database of interconnected information. The problem then comes in when we try to understand how the picture will develop on top of that canvas.

We've only ever understood the internet as human-usable, so it's difficult to even get a grasp on the possibilities that could be open to us with an AI-driven web, let alone what might actually form atop that canvas.

This is arguably what allows CEOs such as OpenAI's Sam Altman to claim that AI will be "as big as the internet, maybe bigger". It's a sentiment that Google's Pichai shares: "I think AI is going to be bigger than the internet. There are going to be companies, products, and categories created that we aren’t aware of today."

AI, explained

OpenAI logo displayed on a phone screen and ChatGPT website displayed on a laptop screen are seen in this illustration photo taken in Krakow, Poland on December 5, 2022.

(Image credit: Jakub Porzycki/NurPhoto via Getty Images)

What is artificial general intelligence?: We dive into the lingo of AI and what the terms actually mean.

In his interview, Pichai—although not immune to the now seemingly obligatory tech CEO speculation—does seem to acknowledge this hidden horizon. He says that AI, a "horizontal piece of technology", can lead to impacts and returns on investment that "may not always be immediately obvious."

He also says that he thinks we're "going to see a new wave of things, just like mobile did. Just like the internet did."

And that's as good a reminder as any, right? Did anyone really predict the way the internet would pan out, or the way the smartphone would? Part of the reason these shifts were so pivotal is because they were so unpredictable.

So, as admittedly yawn-inducing as all these "the next internet!" pronouncements are, these CEOs might just have a point: much like the internet, no one really knows what AI will morph into, even if they're the ones leading it.

]]>
https://www.pcgamer.com/software/ai/listening-to-googles-ceo-talking-what-about-the-future-of-ai-holds-just-reinforces-that-nobody-can-know-what-the-future-of-ai-holds/ mNet5Qm3YC8PdCaEGNA2J6 Wed, 28 May 2025 15:45:18 +0000
<![CDATA[ Computex 2025 showed that innovation in PC hardware often involves sticking a screen on something but don't let that distract you from the genuine improvements underneath ]]> Last year, I reviewed a new liquid cooler with a screen on it. It's called the Hyte Thicc Q60, and it comes with a 5-inch screen powered by an ARM chip and running Android. I had two main thoughts at the time: 'this is a novel idea' and 'this is absolutely ridiculous'. Little did I know that a year later, I'd be walking around the showfloor at Taiwan's top tech show, Computex 2025, and the majority of the companies I visited would be showing off a liquid cooler with a screen plonked on top.

Hyte wasn't the first to stick a screen on a liquid cooler, but it certainly went to great lengths to stand out from the crowd. At Computex, I saw companies going to ever more extreme lengths for much the same reasons. Over at Lian Li's booth, the company showed off a prototype liquid cooler that not only features a screen but a motorised one, allowing the user to move it up, down, left, and right at the click of a button. Next to it sat a liquid cooler with a dial to control some of its functionality.

Then there's the unit over at Xigmatek's booth. Not to be one-upped by anyone, Xigmatek thought it wise to plonk a 7-inch screen on its liquid cooler pump unit. The screen attaches via magnets, and imagine my surprise when the rep at the booth cut the video playing on it and used it to browse the Windows desktop as a second monitor instead.

I like absurd technology and weird PC builds, let me make that clear, but I was left wanting for some real technological advancement from the showfloor. I get that screens are popular, and they do stand out, which is advantageous when you're fighting for attention with the next booth, but where's the innovation in performance, thermals, and acoustics? What is going to make a real difference to my next gaming PC build?

Turns out, there's a lot to be excited about headed to our gaming PCs. I just needed to dive a little deeper.

Since we're on the subject, the latest development in liquid cooling from Asetek is one such innovation. The company manufactures liquid coolers for many brands, including Fractal, MSI, Asus, Phanteks, and more, and its latest cooler design is called Ingrid. Ingrid is promised to be the company's quietest yet, featuring a new pump design with improved tolerances that, Asetek R&D head Thomas Ditlev tells me, allow for operation "on a level where you can't hear it."

Ingrid is aimed at workstation users as much as gamers, but it will be headed to gaming PCs in the near future. Antec had an Ingrid-powered liquid cooler on display at the show, though admittedly only a prototype, and Asetek says it has others signed on and interested in using it. I did try to test out the Antec Vortex View—this prototype liquid cooler has a screen on it, just in case you forgot my previous point—but the hubbub of the showfloor made any acoustic testing nigh impossible. I couldn't hear it even with my ear up close, for the record.

"When it's installed, you can always hear the fans, you can't do much about them," Ditlev says. "But with good thermal performance and really low noise pumps, and then you can dial down the fans, and get it close to noiseless."

That is a genuine improvement that will make a difference to our gaming PCs and how they function, especially if you're using an older liquid cooler with one of the frightfully loud pumps at full whirr.

There's another new product headed our way with aspirations much the same as Ingrid. It's Noctua's new liquid cooler, or rather its first-ever liquid cooler. It's a big step for the famous brown and beige cooling brand, and that's why it's using Asetek's older, performance-focused Gen8 platform under the hood. But that's not the exciting bit. No, Noctua says it has managed to make the louder Asetek pump much, much quieter (around a 5.7 dB(A) average noise reduction) using a combination of 3-layer soundproofing and a tuned-mass damper.
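
For a rough feel for what a 5.7 dB(A) average reduction actually means, the standard decibel arithmetic is sketched below. The figure is Noctua's claimed average rather than anything we've measured, and the loudness conversion is the usual rule of thumb, not a precise psychoacoustic model.

```python
# Convert a claimed ~5.7 dB(A) noise reduction into more intuitive ratios.
reduction_db = 5.7

power_ratio = 10 ** (reduction_db / 10)    # ratio of sound power
loudness_ratio = 2 ** (reduction_db / 10)  # rule of thumb: ~2x loudness per 10 dB

print(f"~{power_ratio:.1f}x less sound power")             # ≈ 3.7x
print(f"~{1 / loudness_ratio:.0%} of perceived loudness")  # ≈ 67%
```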

Tuned-mass dampers, as I'm sure you're aware, are used in large skyscrapers to counter movement from things like earthquakes. There's a famous one in Taipei's 101 building, which sits just down the road from the halls of Computex. That's a convenient comparison for Noctua and one that it takes full advantage of when explaining the concept to me at its booth.

Fun fact: the Taipei 101 has its own mascot designed by Sanrio, owner of Hello Kitty, and it's a tuned-mass damper with arms and legs called the Damper Baby.

I could do with my own tuned-mass damper to get me back on point. Ahem, the main takeaway is that, even in liquid cooling, there's a lot more going on that's set to improve our PCs in material ways and beyond big screens and bright lights. Noctua also had its thermosiphon liquid cooler prototype on show at Computex 2025, a novel design for a chip chiller that doesn't use any sort of pump. Science keeps this cooler functioning, and that it does—Noctua had it operational at its booth on a CPU playing one of the F1 games.

A photo of Noctua's thermosiphon CPU cooler project, as displayed at Computex 2025

(Image credit: Future)

And it's not just liquid cooling seeing real innovation.

From perusing the many new cases on show at Computex, I can say that Corsair, Havn, Geometric Future, and Lian Li are all set to shake up airflow optimisation, striving for higher airflow and lower temperatures.

Havn has carved out a scoop to collect the air from its new and enormous 180 mm fans in the front of its new BF 360 case. This design has gone through testing to ensure it's up there with the best PC cases around, and Havn would like to believe it's actually better. We'll have to see about that when we get one in ourselves for testing. Lian Li has adopted a vent on the side of its Vector V200 case, which helps pull in air from the side panel up through fans mounted in the bottom of the case, aiding a chimney-style airflow.

Corsair had the triple-chamber Air 5400 on display at Computex—a close second for my top case of the show. This features Venturi effect fan shrouds and a small chamber to the front-right of the case, which is designed to fit a 360 mm radiator. Behind that is a scoop—again with the scoops—to collect the hot air from the rear of the radiator and disperse it out the side of the PC case and away from any of the components inside the other two chambers.

It's a smart design, but Corsair is not alone in coming up with the concept. Geometric Future's massive Model 9 also has a separate chamber for the radiator to keep the rest of the components out of the firing line. It's not quite as small or sleek as Corsair's Air 5400, but the concept is clear: get the hot air from your radiator as far away from your thermally-sensitive components as possible.

A slightly different approach to case cooling, adopted by Tryx, is a specialised cross-flow fan. This pulls in air from the side of the case, through a vertically-mounted intake running the length of the chassis, and it's used in this instance to bypass the fabric overlaid on the front and side panels. It's intended to be used alongside the front fans, helping to maintain airflow that might otherwise be somewhat restricted, but I'm keen to see how well this solution works. Tryx also had a liquid cooler, an air cooler, and a case with a screen on it at the show—who'd have thought?

Another company not afraid of sticking a screen on a liquid cooler is Cooler Master, though when I visited its HQ, there was something more exciting for fans of air cooling like myself. It's called 3DHP, for 3D Heat Pipe, a slightly confusing name given that heatpipes are already 3D objects, but that's never stopped a company branding exercise before. The underlying tech is essentially the addition of an extra heatpipe to the existing U-shape.

Strip an air cooler back to the basics, and you end up with a U-shaped heatpipe. The lower, shorter edge runs through the coldplate, which comes into contact with the CPU, and the longer, upright edges on either side are where the heatsink is mounted. If you look at this U-shape and imagine turning it into a trident with a line down the middle, that's what 3DHP is. It's an extra heatpipe.

The benefit of this extra heatpipe, Cooler Master tells me, is that it allows the centre of a heatsink to be more effectively utilised in dispersing heat generated by the chip.

That's a new technology rolling out with Cooler Master's latest air coolers, which just goes to show there's still life in that old, air-cooled dog yet. As a fan of air coolers, it's good to see some progress on the air cooling front, though I'm still doubtful it'll lead to any major change in perception for air coolers—you certainly want liquid for a properly high-end gaming CPU, especially if overclocked.

A Cherry IK key switch on display at Computex 2025 within a test keyboard pad.

(Image credit: Future)

Now for something completely different from my chat with Cherry, makers of mechanical key switches. Or should I say makers of key switches? Its latest lot are far from mechanical. These are its IK switches, and they are built using induction technology. We've seen induction before on the Ducky One X, though these are totally different from the ones used there. Cherry is extremely excited about IK's prospects in the market for a few reasons: I'm told they're cheaper to manufacture than mechanical switches, more reliable, and less power hungry than Hall effect—though Cherry also has new Hall effect switches.

"We at Cherry say, after 30 years of MX, this is the old era. Our combustion engine. This [IK] is the new era," says Gunnar Schreck, Cherry global product manager.

Computex 2025

The Taipei 101 building and Taipei skyline in Taiwan.

(Image credit: Jacob Ridley)

Catch up with Computex 2025: We're stalking the halls of Taiwan's biggest tech show once again to see what Nvidia, AMD, Intel, Asus, Gigabyte, MSI and more have to offer.

Now that's a serious shake-up to the market, and yet not all that well signposted over at Computex. In fact, Cherry had one test board at the back of its booth to showcase the new technology, which it seems confident will be as ubiquitous as mechanical switches in the near future. Just another potentially huge development in PC hardware that doesn't get quite as much screen time as flashier components with lots of lights.

Clearly there's a lot to be excited about in PC hardware in the near future, outside of the parade of new processors and new graphics cards. Cherry's IK switches are coming later this year, Cooler Master's 3DHP coolers are already on the way, Noctua's first liquid cooler is set to arrive by 2026, and Asetek's new Ingrid platform is likely to reach the market around that same time. All promising genuine improvements in ways that matter to PC gamers… though, okay, I admit I do want the motorised liquid cooler screen.

]]>
https://www.pcgamer.com/hardware/computex-2025-showed-that-innovation-in-pc-hardware-often-involves-sticking-a-screen-on-something-but-dont-let-that-distract-you-from-the-genuine-improvements-underneath/ WTvzVYBjTPxTUpq6qsMi8e Wed, 28 May 2025 14:41:50 +0000
<![CDATA[ Windows pack-in classic Space Cadet Pinball has been unofficially ported to Android devices—and it's free ]]> What came before the iPad baby? Well, with both parents working various IT systems jobs throughout my youth, in my case I was always destined to become a desktop gremlin. Hastening that descent were a number of games, but the accessibility of 3D Pinball for Windows – Space Cadet was the one that probably annoyed my folks the most. First packaged alongside Windows 95, the game no longer requires you to dig out an old machine for a nostalgic round of pinball.

Space Cadet Pinball is now available for free on the Google Play Android app store (via PCWorld). It's very much still the game you remember, completely free of modern mobile dreadfulness such as microtransactions or in-app ads. The Android port's developer, Kyle Sylvestre, has added online leaderboards so you can compete on high scores, though otherwise an online connection isn't even necessary to play.

However, as this is a port for your smartphone, you are stuck tapping an on-screen control overlay. 'Touch screens were a mistake' aside, it's also worth underlining this isn't an official port by Microsoft (or indeed the long since defunct Cinematronics), but a project by a lone nostalgic developer based on a decompilation of the original game by k4zmu2a on GitHub.

Judging by Sylvestre's Reddit post history, the developer is also working on an iOS port. Sylvestre posted about their port on the Android subreddit too, but the post has since been removed by mods. Far be it from me to judge what anyone has on their smartphone, especially when official social media apps are ever hungry for one's data, but it never hurts to have context before you tap 'install'.

Space Cadet Pinball made its debut as part of the Microsoft Plus! enhancement pack for Windows 95, but its inclusion began to be phased out around 2001. According to long-term Microsoft employee Raymond Chen, the version initially intended for the 64-bit edition of Windows XP ran into a game-breaking collision bug, and so was removed. Chen offers a little more context about the various quirks of Windows development that led to this bug in his follow-up blog post here.

Space Cadet Pinball did eventually return for the Windows XP Professional x64 Edition in 2005, with comparatively minor visual glitches. It's been MIA from OS releases ever since, and obviously you won't find it in Windows 11, or the soon to be no longer supported Windows 10. Chen sheds light on this too, explaining that attempts to revive the game as a Microsoft Garage project were scuppered due to the original licensing agreement.

Apparently, the agreement only ever allowed Space Cadet Pinball to be released alongside successor Windows OS products after 95, or as part of an enhancement pack like the aforementioned Microsoft Plus!. To get more specific, the original agreement prevents Microsoft from putting Space Cadet Pinball out as a standalone release, or from making the game's source code publicly available—hence k4zmu2a's decompilation and Kyle Sylvestre's very much unofficial Android port.

There are ways to play it on PC today, however. Google the name and you'll find a few. Players have been keen to get back to hitting space balls for an age—here's a PC Gamer story on how to play the game from seven years ago. No idea if that one still works, mind.


Best gaming PC: The top pre-built machines.
Best gaming laptop: Great devices for mobile gaming.

]]>
https://www.pcgamer.com/hardware/windows-pack-in-classic-space-cadet-pinball-has-been-unofficially-ported-to-android-devices-and-its-free/ pdRHTMivs4zHAggekHfiKD Wed, 28 May 2025 14:36:56 +0000
<![CDATA[ Elden Ring Nightreign PC performance analysis: A bare-bones console port with glitches a-plenty ]]> It's fair to say that Elden Ring Nightreign appeared somewhat out of the blue, as it was only announced last December. Perhaps even more surprising are its co-op, rogue-like gameplay and procedurally generated worlds, aspects that most (if not all) of FromSoftware's games eschew.

But that's what we've got, so if the thought of an Elden Ring 'Lite' appeals, you'll want to know how well it runs, yes? Especially given that the original Elden Ring was capped to 60 fps and, on some gaming PCs, was cursed with an ever-present stutter. Well, Elden Ring Nightreign hasn't escaped those problems either, I'm afraid.

It's nowhere near as bad, thankfully, but regardless of what gaming PC you have, you'll still experience a hiccup or minor pause in the frame rate every now and then. It mostly happens while traversing the game's world, but it also crops up in hectic battles—precisely where you don't want it.

How much stuttering you'll see is going to be very dependent on what hardware you have in your gaming PC. Nightreign's system requirements are very light, but there's a devil or two in those details, which we'll look at shortly.

(Image credit: Bandai Namco Entertainment)

Nightreign also includes Elden Ring's 60 fps cap, with pre-rendered cutscenes coming in at 30 fps. At some point, I have no doubt that modders will knock something up to get around this limit, but for the moment, that's as good as it's going to get.

Due to the frame rate limit, there was little point in doing a normal PC performance analysis (i.e. testing all the quality presets across three resolutions), so for Nightreign, I've simply run the game on a range of PCs and captured how well the game runs. It's not a comprehensive spread of configurations, but it should give you a reasonable idea as to how well Elden Ring Nightreign will run on your gaming PC.

Tested on: Ryzen 7 9800X3D | GeForce RTX 5090 | 32 GB DDR5-6400

4K | Maximum quality


Let's start with Elden Ring Nightreign at its best. I don't mean that in a graphical sense because the game's visuals are nowhere near as good as Elden Ring's, and subjectively, it's more akin to a game from 10 years ago. When I say 'best' in this case, I mean as glitch-free as possible.

The above capture was taken with Nightreign set to 4K and maximum graphics settings, and you can see that, for the most part, everything is very smooth. Of course, it should be on such a gaming PC, but I have to say that even this rig suffered from frame rate hiccups now and then, especially as one traverses the world and more so when you've got a full complement of teammates with you.

As you can tell from its power consumption, the GeForce RTX 5090 is effectively doing very little, pushing the game's performance almost entirely onto the CPU and the rest of the system. This means that any minor background issue will be amplified and cause the frame rate to momentarily drop.

Unfortunately, there are no options to replace Nightreign's default anti-aliasing (AA) solution with an upscaler running at 100% or higher, such as AMD's FSR Native AA or Nvidia's DLAA, to give the GPU more to do.

However, I did test out Nvidia's DLDSR (Deep Learning Dynamic Super Resolution) with the RTX 5090 in a Core Ultra 7 265K rig, with 48 GB of DDR5-8000, and as the above video shows, it does look better. But even with a setting of 2.25x (i.e. forcing the internal rendering resolution to 5760 x 3240), the GPU still isn't running anywhere near its full capacity.
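
As a quick sanity check on that internal resolution: DLDSR's scale factors apply to the total pixel count, so each axis grows by the square root of the factor. A minimal sketch of the arithmetic, assuming a 4K output resolution:

```python
# DLDSR factors multiply total pixel count, so each axis scales by sqrt(factor).
import math

base_w, base_h = 3840, 2160  # 4K output resolution
factor = 2.25                # DLDSR 2.25x setting

axis_scale = math.sqrt(factor)
internal = (round(base_w * axis_scale), round(base_h * axis_scale))
print(internal)  # (5760, 3240)
```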

Tested on: Core i5 13600K | GeForce RTX 4070 | 32 GB DDR5-6400

1440p | Maximum quality


It's a similar story when using a lower-tier graphics card that's only two years old, a GeForce RTX 4070. Elden Ring Nightreign has no problem reaching 60 fps with this setup, although 4K is a little too much for the 4070 when using maximum quality settings. It's not the type of graphics routines taking place that's the problem, just the sheer number of pixels.

So while the above video is 4K, the gameplay was captured at 1440p. Once again, the RTX 4070 isn't doing much, which suggests that 4K should be doable, but the GPU just couldn't sustain 60 fps at that resolution.

Also note that, compared to the Ryzen 7 9800X3D rig, stuttering is more prevalent, even when not traversing the world. Just moving the camera around rapidly during combat induces frame rate drops. Sure, the Core i5 13600K isn't a high-end CPU, but it's not slow for gaming, either.

Tested on: Ryzen 7 5700X3D | Radeon RX 6750 XT | 32 GB DDR4-3200

1440p | Medium quality


Now let's take a look at what happens when you use older hardware and, more importantly, an AMD graphics card. The first problem I encountered with this rig is that Elden Ring Nightreign was capped to 30 fps, not 60. There are no options to adjust vsync or anything like that in the game, so I spent a good deal of time messing about in AMD's Adrenalin software to resolve this problem.

The only thing that worked was to force vsync to always be off, but as the video amply demonstrates, you're then just left with some of the worst tearing and timing glitches I've seen in a very long time. At least I could consistently reach 60 fps with the Radeon RX 6750 XT, though only by using 1440p and the Medium quality preset—1440p High is also playable, but the frame rate just isn't as steady as with Medium.

Could you cope playing for hours on end with the game tearing away like that? I know I couldn't, and it's not just the screen tearing. At Medium quality, the level of detail (LOD) transitions are very noticeable, especially if you can see far into the distance as you move along. Vegetation and shadows pop into view very starkly.

I also tested Nightreign using a Radeon RX 7900 XT in this rig, and it also had the same 30 fps problem. Forcing vsync off in Adrenalin once again solved matters, and I could get the full 60 fps at 4K High or 1440p Maximum quality. But I couldn't get rid of the tearing, and I suspect it will require a driver update from AMD to fully resolve it all.

Tested on: Core i7 9700K | Radeon RX 5700 XT | 16 GB DDR4-3200

1080p | High quality


Interestingly, though, not every AMD GPU I tested had the 30 fps issue. Using the oldest gaming PC I have—a Core i7 9700K setup from 2018, with a Radeon RX 5700 XT—I was surprised to see that this ancient box ran Nightreign at 1080p High quality with few problems.

Yes, stuttering was present (especially when moving the camera about very quickly), but it was far less of a problem than I expected it to be. However, trying 1440p on Medium or 1080p on Maximum quality tanked the frame rate hard, more than it should do, really. While Nightreign doesn't need upscaling for newer gaming PCs, old rigs like this one would really benefit from being able to use FSR Quality or Balanced to reduce the pixel load.

However, given that FromSoftware's PC port of Elden Ring Nightreign is as bare-bones as possible, it's unlikely that it will be added in a future patch. In fact, there are so few PC-related graphics options that I'm not hopeful of seeing any patches addressing the stuttering and tearing.

Tested on: Asus ROG Ally | 15 W mode

1080p | Low quality


The last gaming PC I tested Elden Ring Nightreign on was an Asus ROG Ally, with its power limit set to 15 W. The reason why I prefer to use this value, rather than the maximum 30 W, is that you get far more battery life for handheld gaming when the lower power value is used.

Just as with the RX 5700 XT, the 30 fps problem didn't rear its head with this device, though getting the performance anywhere near 60 fps proved impossible. The capture you can see above is at 1080p Low quality, and while the ROG Ally often reaches 40 frames per second or more, the 1% lows are only around 25 fps.

And, truth be told, Nightreign doesn't look very nice using the lowest graphics settings, with barely any AA being applied and shadows glitching across the landscape. Using the Medium preset improves things greatly, but then you have to put up with sub-30 fps performance. It's better when using the Ally's full 30 W mode, but then you're not going to be gaming for very long.

One alternative I explored was Radeon Super Resolution (RSR), a driver-enforced upscaler that works just like FSR 1.0 does. While it allowed me to use the Medium preset and just about hit 30 fps, there was just too much input latency for it to be enjoyable. So, if you are planning on some late-night Nightreign sessions on your handheld gaming PC, you'll need to stick to the Low preset and unpleasant graphics.

If your handheld is a Steam Deck, though, you might want to pass on Nightreign altogether because if a ROG Ally struggles, the Deck's weaker CPU and GPU are unlikely to cope at any setting.

Final thoughts

(Image credit: Bandai Namco Entertainment)

In addition to the above test PCs, I checked out Elden Ring Nightreign on a Core i7 14700K, Ryzen 7 7700X, and Ryzen 5 5600X, with the above GPUs, plus a GeForce RTX 2060 and RTX 3060 Ti.

Generally speaking, none of the GPUs had any problem reaching the frame rate cap, though it does require a little bit of experimenting with resolutions and quality settings. All you really need to do, though, is fire up Nightreign at your preferred resolution and start with the High quality preset—if you need more fps, then just drop it to Medium (avoid Low, if you can).

What I can say with certainty, though, is that if you have an Nvidia GPU, the game will run as glitch-free as it can, but if your gaming rig is home to an AMD GPU, I suspect it's going to be pot luck as to whether you'll experience the same problems that I did.

The newer the hardware, the better everything will be, though that's to be expected, of course. One also expects a FromSoftware PC port to be somewhat problematic, but I have to say that Nightreign is pretty disappointing in terms of PC options and performance. It's not the frame rate cap I have issues with—any game that runs consistently at 60 fps should be an enjoyable experience, but Nightreign struggles to do this far too frequently.

Behold Nightreign's vast array of PC graphics options. (Image credit: Bandai Namco Entertainment)

A Ryzen 7 9800X3D and RTX 5090 combination should be permanently 'stuck' at 60 fps in Nightreign, but it often isn't, and there's no obvious reason as to why it's not. It's perhaps understandable when playing online with friends, but offline and solo? That kind of hardware shouldn't be dropping in performance whatsoever.

If Elden Ring Nightreign was toting path-traced, global illumination splendour with every pixel, the performance would be understandable, forgivable even. However, that's absolutely not the case here, and while Nightreign's stuttering is nowhere near as bad as it is in Elden Ring, I am left wondering just why FromSoftware still hasn't solved the issue.

And on the point of graphics, Nightreign's are serviceable, at best. The overall art design isn't bad at all—some of the vistas are genuinely lovely to see—but the anti-aliasing, shadows, and texture quality are disappointing when using anything less than the maximum quality settings. For a 2025 PC game, it's really not good enough.

At least Nightreign runs on old hardware, and it won't take up too much of your PC's storage, with the entire installation requiring just under 21 GB of space. And gameplay beats graphics, every time, so if you don't mind the looks, minimal PC options, and the ever-present penchant for stutters, Nightreign might just tickle your multiplayer Elden Ring bone.


Best CPU for gaming: Top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game first.

]]>
https://www.pcgamer.com/hardware/elden-ring-nightreign-pc-performance-analysis-a-bare-bones-console-port-with-glitches-a-plenty/ 7dbff2xVYitBZUxekrZrsL Wed, 28 May 2025 14:00:00 +0000
<![CDATA[ Samsung's new Peltier cooling tech is more efficient and less wasteful and could be great for keeping a power-hungry CPU chilly ]]> New cooling technology is one of the more exciting things we can hear about in the world of PC hardware, in my opinion. Not only can it apply to so many different components—though chiefly the CPU—but also different brands and generations. So colour me icy blue with anticipation over the latest news out of Samsung Towers.

According to Samsung's summary of a very technical paper, produced in collaboration with the Johns Hopkins Applied Physics Laboratory (APL), the company has made some significant strides in the application of the Peltier cooling technique.

Apparently, "the new manufacturing process not only drastically reduced the amount of Peltier materials required down to about 1/1,000 of the material typically required, but also simplified the production steps. This advancement enhanced scalability and enabled mass production, with promising prospects for significant gains in both cost-effectiveness and environmental impact."

Peltier cooling makes use of the Peltier effect to regulate temperature, allowing for not just sub-ambient (lower than room temperature) temperatures but also precise control of that temperature. This is achieved without the use of refrigerants which are usually required for sub-ambient temperatures.

This kind of thermoelectric cooling makes use of the fact that when two dissimilar conductive materials are joined together, a current passing through them in a kind of circuit generates heat at one of the junctions and absorbs heat at the other.

A Samsung and Johns Hopkins APL Peltier cooling device

(Image credit: Nature Communications, Samsung, Johns Hopkins University)

If you put a bunch of the absorption-side junctions next to whatever it is you want to cool down, it should absorb the heat produced by this thing and transport it over to the heat-generating side of the junction. This excess heat can then be carried away by a more traditional method such as a liquid loop.
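
For the curious, the textbook heat balance for a single-stage thermoelectric module captures the trade-off Samsung is chasing. The symbols below are the standard ones from thermoelectric theory, not figures taken from the paper:

```latex
% Net heat pumped from the cold side of a single-stage Peltier module
Q_c = S \, T_c \, I \;-\; \tfrac{1}{2} I^{2} R \;-\; K \, \Delta T
% S: Seebeck coefficient of the couple, T_c: cold-side temperature,
% I: drive current, R: module electrical resistance,
% K: thermal conductance, \Delta T = T_h - T_c
```

The first term is the useful Peltier pumping, while the Joule heating and back-conduction terms are the losses that thinner, nano-engineered thermoelectric layers aim to shrink, which is where any efficiency gains would come from.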

"Compared to traditional vapor compression methods, Peltier cooling enables fast and precise temperature control with a simpler configuration, making it applicable to various industrial fields, including home appliances, semiconductors, medical devices, automotive electronics and data centers," says Samsung.

We've seen Peltier cooling specifically applied to CPUs before with Intel's collaboration with EK for the EK-QuantumX Delta TEC water block. It isn't flawless, though, as it consumes a lot of energy, pumps out a lot of heat that requires a copper cooling loop to handle, and costs quite a lot.

Condensation is always a problem with thermoelectric cooling, too, as it is with any cooling that achieves sub-ambient temperatures, with phase change cooling being the most obvious example. But Intel x EK managed this with its Peltier cooler primarily thanks to an insulation shroud keeping condensation away from the rest of the PC's components, plus its Cryo Cooling software, which managed temperatures and power to stop condensation forming in the first place.

Top CPU coolers

The best liquid coolers on a two-tone grey background

(Image credit: Future)

Best AIO cooler for CPUs: Keep your chip chill.
Best air cooler for CPUs: Classic, quiet cooling.

Back to the present, the main development from Samsung and the APL here seems to be in making things a lot smaller/thinner as well as scalable for mass production. The "nano-engineered thermoelectric materials", if unleashed into the wilds of the semiconductor cooling industry, could presumably mop up those two main drawbacks to Peltier cooling: energy consumption and heat production.

Of course, basic thermodynamics would tell us that if we're keeping temperatures lower than the surrounding air space then we can expect some additional heat and energy expenditure in doing so. But the trick will be minimising this as much as possible, and it looks like Samsung might have gone a way towards achieving this.

It'll be interesting to see how the technology develops as well as how it compares to phase change cooling tech.

]]>
https://www.pcgamer.com/hardware/cooling/samsungs-new-peltier-cooling-tech-is-more-efficient-and-less-wasteful-and-could-be-a-great-for-keeping-a-power-hungry-cpu-chilly/ A5iXgYFxg9aATsXxoEUPYH Wed, 28 May 2025 13:48:03 +0000
<![CDATA[ China has held the world's first robot martial arts tournament and I can't think of a single thing that could possibly go wrong ]]>

We can surely all agree there's absolutely nothing to be concerned about when it comes to robots and AI. So, it makes perfect sense to hold what's claimed to be the world's first martial arts tournament for robots. It's all completely harmless fun and games, nothing that remotely brings to mind Cyberdyne Systems Model 101 gone rogue. Nope.

Anyway, the China Media Group World Robot Competition Mecha Fighting Series reportedly kicked off—literally—on May 25 in Hangzhou, China. According to Asia Times, the tournament included Unitree Robotics G1 robots weighing in at 35 kilograms and 132 centimeters tall.

The G1 is actually available to buy from $16,000, just in case you want your own killing machine, sorry, friendly household bot, and it comes with 3D LIDAR and a two-hour battery life. Inevitably, the robots run AI models trained on motion-capture data of kickboxers' movements for the tournament, but it's not clear if that particular mapping is available to Unitree customers. We suspect not.

Each fight was made up of three rounds of two minutes each, with a punch scoring one point, and a kick three. Five points were deducted for falling over and 10 points if the bot failed to stand up within eight seconds.
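
Purely as an illustration of how that scoring adds up, here's a minimal sketch. The event names and bout structure are invented for the example; the organisers haven't published anything beyond the point values above.

```python
# Illustrative scorer for the reported rules: punch = 1 pt, kick = 3 pts,
# -5 pts for falling over, -10 pts for failing to stand within eight seconds.
POINTS = {"punch": 1, "kick": 3, "fall": -5, "failed_to_stand": -10}

def score_bout(events):
    """events: list of event names drawn from POINTS' keys."""
    return sum(POINTS[event] for event in events)

# Example: two punches, a kick, and one fall the bot recovers from in time.
print(score_bout(["punch", "punch", "kick", "fall"]))  # 0
```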

Li Gaofeng, a researcher at Zhejiang University’s College of Control Science and Engineering, said, "combat fight is a difficult task for humanoid robots due to the intensive confrontation during the fight. Robots need to mind their movements and react to their opponent’s moves. All these requirements significantly challenge the robots’ algorithms, electronic parts and speed reducers.”

That said, the robots were not fully autonomous. Human operator teams controlled the robots, "in a human-machine collaborative way," according to Chen Xiyun of Unitree Robotics.

If all this sounds like a robot zombie apocalypse in the making, a quick scan of the YouTube highlights paints a slightly different picture. While some of the moves are impressive, more often the bots are flailing around, punching at thin air or tripping over themselves. It's more the stuff of comedy than nightmares.

Indeed, human overlords wielding some kind of remote controllers can be seen on the sidelines. So, it seems like the AI element is limited to the specifics of a given kick or punch in response to commands, very much like playing, ya know, a video game. We're a long way off bots that can do their own fighting thing, if this tournament is anything to go by.

That said, this stuff is undeniably developing fast and probably wouldn't have been possible at all, even with the existing caveats, a few years ago. Who's to say these things won't be dancing around and then right out of the ring in a few years, fully capable of a deftly choreographed murderous rampage? What a time to be alive—for as long as you can outrun the robots...


Best CPU for gaming: Top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game first.

]]>
https://www.pcgamer.com/hardware/china-has-held-the-worlds-first-robot-martial-arts-tournament-and-i-cant-think-of-a-single-thing-that-could-possibly-go-wrong/ mqf6jWEsDE4uWousYYVkmL Wed, 28 May 2025 13:15:04 +0000
<![CDATA[ As US tariff uncertainty continues, Nvidia's RTX 5090 dips under MSRP in the UK and EU ]]> As I type these very words, and less than six months after its official launch on January 30, it is at last possible to buy an RTX 5090 for under Nvidia's officially recommended retail price. Cue much rejoicing.

There is, of course, a catch. It all depends on where you live. In the UK, the 5090 has an MSRP of £1,889, but Overclockers UK will do you a Palit GeForce RTX 5090 GameRock 32GB for the piffling sum of £1,879.99.

OK, that's barely under list, but it's under list all the same and a far cry from the ridiculous markups that have been the norm for all too long. Those markups, sadly, still apply in the US, where the 5090's MSRP is ostensibly $1,999 but the GPU has scarcely, if ever, been seen at the price point.

Right now, at Newegg, for instance, the cheapest RTX 5090 is $2,919.99. And even that is progress of sorts. At least you can buy one.

Meanwhile, in Finland, the RTX 5090 has been spotted for 2,299 Euros, a whisker under the official 2,339 Euros EU sticker price. However, a quick search of proshop.fi, the etailer that listed the 5090 below MSRP, indicates that pricing is currently at the 2,399 Euros MSRP, not below it.

But even that is a pretty major advance on the massive markups that prevailed when the 5090 was launched. As for what to make of these developments in a wider context, these prices do seem to indicate that the GPU market is normalising at last.

After numerous shocks, including crypto mining and the pandemic, demand for GPUs and, therefore, pricing have been acutely elevated for years. However, in some territories, graphics cards are now widely available for MSRP.

Indeed, in the UK, pretty much the whole Nvidia RTX 50 lineup can be had at MSRP or below. That said, AMD's Radeon RX 9070 XT remains stubbornly pricey, with the cheapest examples in the UK commanding £660 or so, well above the £560 UK MSRP for the card.
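
To put those gaps into percentage terms, here's a quick sketch using the spot prices quoted above; they'll inevitably have moved by the time you read this.

```python
# Markup (or discount) relative to MSRP for the prices quoted above.
listings = [
    ("RTX 5090 (UK, Overclockers)", 1879.99, 1889.00),
    ("RTX 5090 (US, Newegg)",       2919.99, 1999.00),
    ("RX 9070 XT (UK)",              660.00,  560.00),
]

for name, price, msrp in listings:
    print(f"{name}: {(price / msrp - 1) * 100:+.1f}% vs MSRP")
```

That works out to roughly 46% over MSRP for the cheapest US RTX 5090, versus a fraction under list in the UK, and about an 18% premium on the RX 9070 XT.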

Still, the broad trend is towards price normalisation. The US remains something of an exception. It's unclear how much of that is a direct result of tariffs. However, demand for graphics cards probably spiked as gamers and other GPU buyers rushed to snag cards before tariffs hit, layering on yet another extraordinary shock onto an already atypical market.

Should tariffs go back to normal and no other shocks hit the US market, with the broader supply of GPUs now looking pretty healthy, we'd expect even the US to see prices trend toward MSRP. But with the Trump administration's tariff policy shifting wildly on a literally daily basis, that's a rather big if.


Best CPU for gaming: Top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game first.

]]>
https://www.pcgamer.com/hardware/graphics-cards/as-us-tariff-uncertainty-continues-nvidias-rtx-5090-dips-under-msrp-in-the-uk-and-eu/ eQCaduKJ7NDrYNZ4yG6ihi Wed, 28 May 2025 11:11:40 +0000