Following initial impressions for Dirt 5 on Xbox Series X going live earlier today, the embargo has now dropped for Yakuza: Like a Dragon, which also arrives on launch day, November 10th.
Early previews seem to be largely positive for the game, but in terms of its Xbox Series X specific features, there doesn't appear to be much to shout about. It does, however, feature three different performance modes for you to choose from, allowing you to prioritise visuals, frame rate, or a mixture of both.
Here's what GameSpot had to say about the performance of each of them:
"When running the game on Series X, you'll have three enhancement options: high resolution, high frame rate, and normal. High resolution looks to be rendering true 4K for the best image quality, but the game will run at what appears to be around 30 FPS. High framerate mode brings down the resolution in order to run much higher FPS, although the drop in graphical quality is very noticeable.
I set the visuals to normal and the game seemed to hold a consistent 60 FPS throughout combat and exploration without sacrificing too much in visual detail--I found it to be the best option since it does offer smoother gameplay than what we've had from previous entries that use the same graphics engine."
Others have echoed similar sentiments, with Press Start confirming that the normal mode will "suit most people" and is the default setting. The outlet also commented on "how much different" the high resolution and high frame rate options look.
Ultimately, it's going to be completely up to you which mode you prefer to use when the game launches on Xbox Series X|S this November. Which do you think you'll prioritise? Let us know below.
[source gamespot.com]
Comments (19)
"I set the visuals to normal and the game seemed to hold a consistent 60 FPS".
So the high frame rate mode is 120fps?
@BlueOcean Not sure, I haven't seen anyone providing specifics yet.
I love the Yakuza games, but they've never pushed the boundaries graphically. I would have thought you’d be getting 4K/60fps easily on the X, especially since it seems like there are very few, if any, graphical changes from the base game.
It’s a turn-based RPG; 120fps seems a bit excessive.
So what's high frame rate mode for then if normal runs at 60FPS?
Hopefully this gets expanded.
This seems a bit confusing so far, and we haven't seen many games yet. I can see this being the defining story of next gen: every game having several different modes with differing levels of 'bits missing' or 'frames per second missing', and different options on the S versus the X.
I'm sure it'll all settle over time, but it feels like a beginner's guide to owning a gaming PC. Perhaps next gen I'll just buy a £500 graphics card for my PC, once I'm trained in the ways of tech talk...
Bit disappointed; I would have expected it to run at 4K/60 without sacrifices given it's a previous-gen game.
Not loving that this "next gen" is just a continuation of the mid-gen, where it's not about a defined resolution and frame rate but about selecting "sucky graphics but great performance" mode or "next gen graphics at the same old frame-dropping standard" mode. Then again, normal mode is called normal for a reason. Sounds like it's 60fps but not 4K? Or dynamic resolution to hold the frame rate, with some VFX disabled. It's a "last gen" game with a "next gen" update, though, so it's not designed around the new hardware.
@StonyKL Assuming your motherboard has the right IO to take advantage of that $500 video card.....that's the conundrum of PC gaming if you're trying to max settings. The video card only matters if the mobo/IO can keep up with it which means upgrading the whole kit as often as consoles anyway. That's the mess that eventually drove me away from it and back to consoles once consoles caught up to running "PC games" decently or better for less money than I'd end up spending there.
@Tharsman The question for me is why isn’t it 4K/60fps? I’m guessing it’s the devs; if Valhalla can run at 4K/60fps, this should be able to as well.
I've included the video from Press Start which shows the difference between the three modes... it's a good comparison!
@Medic_Alert I do understand what you’re saying, and I completely understand that not every game can be native 4K/60fps. Let’s be honest though: this is a PS4 game, and with no graphical improvements on next gen consoles I’d be expecting it to hit 4K/60fps easily.
I do agree with you, though, that they’ve probably blown their budget or just haven't bothered optimising it due to time and effort.
Being a 'last gen' game doesn't mean a thing either. They can turn the polygon counts right up, add in some forms of ray tracing (reflections, global illumination etc.), push level of detail right out, increase draw distance and so on. There is so much that can be done to a current gen game that makes it difficult to hit a native 4K/60 on a next gen console - regardless of whether it has turn-based combat or not.
There are numerous things a dev may need to do besides dropping the resolution just to hit 30fps on current gen consoles. Even though you think it looks 'good', when you look closely you see that shadows are quite low resolution and drop in quality not far from the character, objects in the distance are 2D billboards to keep polygon counts lower, and character models, vehicles etc. are lower quality the further away they are. It's not just textures that drop in quality with distance but objects, the way lighting works (shadows, reflections etc.) and the use of textures to create the illusion of geometric shapes - all of which can be replaced by higher quality assets, the higher polygon count versions they created anyway for when you get close enough to see them. Simply changing LoDs can have a big impact on performance, as the sketch below shows.
Some games even have a setting which increases/decreases the number of 'objects' - AI NPCs, vegetation/rocks, street objects (signs, lights, advertising boards, cars etc.) - which obviously has a big impact on performance: polygon counts, CPU draw calls, AI processing and so on. Dirt 5 has three different settings on Series X, and each of those changes the number of people in the crowds (among other things).
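To make the LoD point concrete, here's a minimal sketch of distance-based LOD selection. Everything in it is hypothetical - the LodLevel/pickLod names, the distance bands and the triangle counts are made up for illustration, not taken from any real engine:

```cpp
#include <cstdio>
#include <vector>

// One pre-authored detail level for a mesh.
struct LodLevel {
    float maxDistance;   // use this level while the camera is closer than this
    int   triangleCount; // rough cost of drawing the mesh at this level
};

// Pick the first level whose distance band covers the camera; beyond the
// last band, fall back to the coarsest level (often a flat billboard).
int pickLod(const std::vector<LodLevel>& levels, float distance) {
    for (size_t i = 0; i < levels.size(); ++i) {
        if (distance < levels[i].maxDistance) return static_cast<int>(i);
    }
    return static_cast<int>(levels.size()) - 1;
}

int main() {
    // Hypothetical character mesh: full detail, simplified model, billboard.
    std::vector<LodLevel> character = {
        {15.0f, 60000}, // LOD0: hero-quality model up close
        {40.0f, 12000}, // LOD1: reduced model at mid range
        {1e9f,  2},     // LOD2: two-triangle billboard in the distance
    };
    const float tests[] = {5.0f, 25.0f, 200.0f};
    for (float d : tests) {
        int lod = pickLod(character, d);
        std::printf("distance %6.1fm -> LOD%d (%d triangles)\n",
                    d, lod, character[lod].triangleCount);
    }
}
```

The idea is simply that the renderer swaps in the cheap version of a mesh as soon as you're too far away to notice, which is why distant characters and cars cost so little to draw - and why pushing those distance bands out on new hardware eats performance fast.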
"If 'X' can run at 4K/60, then 'Y' should" is a ridiculous statement too. For a start, they are different developers with different engines and different priorities, all of which makes a big difference. You don't know whether Yakuza has much higher polygon counts for its characters - ALL the characters, including the distant ones - or how well optimised its engine is for shadows and ambient occlusion. We don't know what compromises to visual settings Ubisoft made to hit 4K/60, or whether the devs here were willing to compromise as much, or settled for 30fps to push graphical settings higher...
Even a 3080 won't do a native 4K/60 minimum with Ultra settings on all games. It can't do it in Flight Simulator, but that's also down to the CPU - another area where you don't know if Yakuza has more complex AI running, for example.
With the new gen, you also don't know whether devs are fully up to speed and have truly optimised the game specifically for the console, or just ported it across and tweaked settings. Crysis is a classic example of a game that isn't optimised for multi-core, let alone multi-threaded, CPUs: most of the workload sits on one core while the others do very little. That one core causes the bottleneck, because CPUs haven't got that much faster - they've gained more cores, but not higher clock frequencies. The Xbox 360's CPU ran at 3.2GHz compared to the 2.3GHz CPU in the most powerful console currently available; the 360 had twice the clock speed of the base PS4's CPU...
There are far too many unknowns to make silly comments like "it's a last gen game" or "if X can do it, then Y can"...
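On that single-core point, here's a contrived toy benchmark - the burn() busy-work function is invented purely for illustration and is nothing like real game code - running the same fixed workload on one thread, then split evenly across the available hardware threads:

```cpp
#include <algorithm>
#include <chrono>
#include <cstdio>
#include <thread>
#include <vector>

// Pointless but CPU-bound busy-work standing in for per-frame game logic.
static double burn(long iterations) {
    double x = 0.0;
    for (long i = 0; i < iterations; ++i) x += 1.0 / (1.0 + i);
    return x;
}

int main() {
    using Clock = std::chrono::steady_clock;
    auto ms = [](Clock::time_point a, Clock::time_point b) {
        return std::chrono::duration_cast<std::chrono::milliseconds>(b - a).count();
    };
    const long total = 200000000L;

    // Everything on one core: the single-threaded "Crysis" situation.
    auto t0 = Clock::now();
    volatile double sink = burn(total);
    auto t1 = Clock::now();

    // The same total work split evenly across the available hardware threads.
    unsigned n = std::max(1u, std::thread::hardware_concurrency());
    std::vector<std::thread> pool;
    std::vector<double> partial(n, 0.0);
    for (unsigned i = 0; i < n; ++i)
        pool.emplace_back([&partial, i, total, n] { partial[i] = burn(total / n); });
    for (auto& t : pool) t.join();
    auto t2 = Clock::now();

    std::printf("1 thread: %lld ms | %u threads: %lld ms\n",
                (long long)ms(t0, t1), n, (long long)ms(t1, t2));
    (void)sink;
}
```

The multi-threaded number comes out several times smaller, but only because this busy-work splits perfectly; a real game loop full of dependencies between systems is far harder to spread across cores, which is exactly the kind of single-core pinning described above.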
@NEStalgia I'm not really sure I get what you're talking about. If you build a modern PC, the motherboard doesn't really mean much for the graphics card. If anything, it's the CPU you need to worry about pairing with the graphics card to avoid a bottleneck.
As long as that's powerful enough, you can just get a $500 card no problem.
Sometimes it's not the hardware but the developer and/or engine. This game is horribly optimised.
@HollowGrapeJ Over the past generation, sure, but prior to that every other graphics card cycle the bus, IO, or memory bus would change and end up being the bottleneck (or not even support the newer cards at their real speed) without upgrading them. It seems it's been stagnant for a while, and the new "norm" for PC is "oh, they never really change, you just upgrade video cards". I suspect that weird calm period is ending with this new gen arriving, and it'll be stuck in the forever-upgrade of IO speeds and the like while chasing the PS5-type setup. I've been out of the PC gaming loop for a while, but when I was in it the video bus would change like every 18 months (ISA, PCI, AGP, AGP Pro, AGP 2x, PCI-X, PCIe 1, PCIe 2, PCIe 3... ugh).
There was like a seven-year period where it was stagnant and didn't change. I think newer PC gamers got a bit complacent with that, but it looks like it's picking up speed again. There was a new PCIe version last year, and another new one is slated for next year. Looks like we're back in a two-year cycle for now, so each video card upgrade probably ends up requiring a new mobo for a new PCIe bus again. And now we'll probably be in an every-other-year IO controller cycle too. Gotta get PC up to PS5 SSD spec, of course!
@FraserG DF video up https://youtu.be/WJ9bODn8FFs
@mousieone I saw! Very insightful as always
@NEStalgia Well, I'm not sure when you were into it, but I don't think I've ever heard of it being that crazy these days. Unless your PC is really, really old, you shouldn't have much issue upgrading a single part like a GPU.
You do need a different motherboard depending on what they do with a new CPU, but a GPU? Most new cards can work with an older system. I think it'd be pretty dumb not to.
If you really went through that type of headache with PC gaming, then you can rest assured it's not like that anymore. SSDs have dropped a lot in price too, so it won't cost too much to upgrade to one of those either.
Well, it was right before that seven-year lull in dramatic changes; I jumped to PS3 and 360 early on and left PC behind. The lull is ending, though. PCIe 5 (2019) would supposedly be needed to take full advantage of a 3090, PCIe 6 is out next year and will probably be needed for next year's cards, and IO controllers will probably start changing soon after the PS5 sets that ball rolling. Sure, a new card works... at reduced speed, if you don't have the bus revision it needs. I think anyone that's been PC gaming since around 2010, but not before, entered an unusual lull period where the tectonics of it weren't shifting every time you turned your head. It seems like it's reverting to "back in my day" at present, when it was always a dynamic landscape. Maybe not as bad as back then... but I don't think a mobo you buy today is going to be the mobo you want in three years if you plan on keeping up with video cards. We'll see whether the era of needing a new mobo, memory and CPU with each GPU change returns if the landscape stays unstable for some years. Then the "youngsters" can learn of my pain.
When I dropped out of it, SSDs had arrived, but they were new, suffered from limited write cycles, were pretty unreliable, and astronomically expensive for tiny amounts of storage. Nobody really used them except the absolute bleeding-edge-at-any-cost people. (Obviously I switched long ago in productivity PCs, but I mean for gaming purposes.)