
There have been many conflicting thoughts about the Xbox Series S. Will it hold back this generation, or will it simply allow more people to get on board? Fans, and even some developers, seem to be split into two camps.
In a recent Game On Daily podcast, Dirt 5's technical director, David Springate, was asked whether the Xbox Series S was a hindrance to development, to which he suggested it wasn't. As an owner of an Xbox Series S, Springate believes "it's absolutely fine in all the right places", adding that he loves his console.
He later noted that the system is a "great value proposition", and he doesn't "see it being a hindrance at all". Dirt 5 runs like a dream on the Xbox Series S and looks beautiful, so these comments seem well founded - but it obviously remains to be seen how next-gen titles will fare as the years go by.
If you haven't checked out Dirt 5, it's a perfect example of what the next-gen hardware can do. It's currently available through Xbox Game Pass, which Springate also said has had a "shockingly huge" impact on the game's player base. Be sure to check it out if you're on the fence.
Do you think the Xbox Series S will be a hindrance this generation? Let us know in the comments below.
[source youtube.com]
Comments (21)
I really don't get any of the negativity toward the Series S, and any dev saying it's a hindrance is just being lazy. I love it, and with Game Pass, the xCloud beta and remote play only going to improve, the Series S will sell well. Not everyone can afford the X - I know I can't - and I am playing top games, spending little, and they're all stunningly beautiful. Those who can't scale down slightly for the Series S are seriously lazy. The Series S is perfect for many people. Some devs just simply cannot be arsed!
Seems like it could become a hindrance later on but I can't imagine it being much of one at the moment.
Devs talking like they need to scale games down for the Series S like it's a Switch port.
I love mine, and will eventually upgrade to the Series X but it's an amazing little console - I just wish it had a little more memory.
How could this ever be a hindrance? Was the Xbox One a hindrance for the Xbox One X? Was the PlayStation 4 a hindrance for the PlayStation 4 Pro? Is a $1,000 PC a hindrance for a $2,000 PC? The answer to all of these questions is "NO". Underlying game engines are not completely rigid. Developers can decide which graphical features are disabled or enabled, and if enabled, to what extent. That is a fact which is several decades old. Graphical presets, nothing new to see here.
Dear devs, some of you need to stop making up excuses and simply come clean. It's okay to not want to cover every system. Just say it, but don't lie! In the end, nothing stops you from just leaving out this or that device. You only have to decide whether the convenience is worth the financial loss.
@lokozar gonna have to disagree with you there. Devs will likely target the lowest common denominator, then up res from there. Will this be a big deal? Now? No. In 5 years? Maybe. We’ll see.
@AgentGuapo
What the ... No! Why would they? Would you? And if yes, why the f*** would you do such nonsense?
If I know I have two systems, and I want to impress with graphics (because that is what people expect from me), I'll take the more powerful device as my baseline, and then adjust (take away/lower graphical features, resolution etc.) from there, until it runs acceptably on the weaker device. Why the heck would I do it the other way around? That would either hamper my success on the more powerful device, OR make it much more complicated in the long run to add to the "high-end version". You can always cut something easily. Adding afterwards is a completely different beast.
@lokozar if I had to guess I'd say it's because in any given situation that lowest common denominator is going to be where the bulk of your player base is. And I suspect underneath all the creative decisions there's going to be a running cost/benefit analysis that skews development toward this majority.
Edit: but I reiterate, this is just a guess.
The only people saying it's gonna hold back next gen are Sony fanboys and Xbox haters...
@Bmartin001 Aren't Xbox fanboys the ones constantly bleating on about the Xbox Series X being the most powerful console? Then suddenly power doesn't matter when it comes to the Series S.
@AgentGuapo Game design isn't based on the weakest GPU and scaled up visually from there. They may look at the weakest CPU and use that - as that is the 'brains' of a game.
The only major differences between the Series S and X are the GPU and RAM. What Dirt 5's technical director is saying is that this difference is negated by having a much lower 'graphical' quality. If you are targeting 4K with UHD-quality textures, you need more GPU processing power and more RAM. Conversely, if you take a game on Series X, you can scale the graphical quality down to a third, use HD textures etc., so you don't need as much RAM. It's not as if the Series S has far fewer CPU cores, or doesn't support DX12 Ultimate features like VRS, Mesh Shading, Sampler Feedback etc., or doesn't have dedicated audio processing, dedicated decompression etc. The ONLY thing the differences affect is purely graphical.
It's not as if the Series S can't handle the AI, the physics, the streaming of data etc. - the only 'compromise' a developer needs to make is scaling 'down' the visuals, as they have ALWAYS done. Devs rarely have to scale up - unless they are making a remaster.
The fact that the Series S is not targeting 4K means it doesn't need as much processing power, and because it isn't targeting 4K, it doesn't need 4K-quality textures either. That 'reduction' in both these areas is why the Series S is specced this way.
I don't need 4K. I don't need a disc drive. I don't need 120fps. I do need full-speed emulation.
Series S seems fine. It’s just like buying a low spec PC. It runs the games but not at quite as high resolutions.
@BAMozzy nice write up, thanks. Didn't realize the differences between X and S were so small. Which, only owning the S, kinda makes me feel better.
@Would_you_kindly The Series X is the most powerful console on the market and is targeting a 'higher' quality graphical presentation - hence the need for a more powerful GPU and more RAM.
The Series S doesn't compromise on features or on its CPU (I know it's clocked a bit slower, but it's still the exact same CPU as in the Series X). The only area that is 'compromised' is the graphics side. 1440p is less than 'half' the size of 2160p, so you don't need the same processing power to run a game at 1440p as you do to run that 'same' game at 2160p with the same 'visual' settings. Some settings can be 'lowered' at 1440p too without impacting the overall image quality - LoD draw distances can be 'reduced' because you have fewer pixels to deliver 'details'. You also don't need 4K textures, which are much larger in file size and require more RAM.
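For anyone who wants the actual numbers (my own quick arithmetic, not from the article): 3840 x 2160 is roughly 8.3 million pixels, while 2560 x 1440 is roughly 3.7 million, so 1440p is only around 44% of the pixel count of 4K - 'less than half', as stated above.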
Point is, the differences in the Series S are there because the Series S isn't trying to push the same graphical quality.
It's like the XB1 to XB1X or PS4 to PS4 Pro. Both mid-gen consoles were designed primarily to push the graphical quality up and target 'higher' resolutions, so the main focus in those consoles was the GPU. The Pro had a minor 'buff' to its CPU and RAM - but for a '4K' console it still had less, and slightly slower, RAM. The XB1X bumped up the RAM, so in 'some' games the texture quality was much better on XB1X compared to the Pro, and the bigger GPU led to a 'higher' resolution.
This is much more like someone settling for an RTX 3060 because they have an HD or 1440p monitor, whereas someone with a 4K one wants a 3080/90. Both GPUs will run the same games with the same features etc. The 3060 owner could get away with less RAM and a slightly slower CPU to get the 'SAME' performance, just at a lower res.
Power does matter if you want the 'very best' visuals and performance - but if you don't need that graphical power to render and process ultra-high-resolution visuals, then it's pointless to buy the 3080. Maybe in five years' time the 3060 is 'struggling' to do 1440p at 60fps with all the visual settings on high (or better), but the 3080 may well be struggling to do the same at 4K too.
Both Series consoles have VRS, Mesh Shading, Sampler Feedback and, now, FidelityFX - this alone is more than enough to give them a MASSIVE advantage over previous generations. Both have the full feature set of DX12 Ultimate with the same architecture and 'chip' design (albeit smaller on the Series S due to a smaller GPU), and both have the same CPU, so physics, AI, destruction etc. won't be impacted. Both have the same data bandwidth, so streaming in assets etc. won't be affected...
I can't stress this enough: the only difference is the graphical processing power. To go from, say, 1080/60 to 2160/60 requires a big jump in GPU processing power. If 4K TVs had never come along, the Series S is more than capable of delivering 'next gen' gaming at 1080p, but with 4K being the target for both the PS5 and Series X, they need more power and RAM...
@zupertramp
That's not how it works. Again, neither the Xbox One, nor the PS4, nor cheaper PCs ever held back their respective more potent counterparts. That's because games were developed with the better system as the baseline in mind.
Even in pure PC games development it always goes from high to low. You first code the game on/for powerful workstations, where the first versions look stunning but run like crap, and then you tone it down and optimize around a targeted system configuration. The rest is handled by presets and granular graphics settings. If it doesn't run smoothly enough on your system, you're out of luck and know it's time to upgrade.
That, for example, is also the reason why you can find "hidden" switches, activated via config files, in some games that make them look even better. It would be absolutely stupid to do it the other way around. Imagine where we would be if developers always targeted the systems most people had at home. Take a look at e.g. Steam's database to get an idea of how low that is, and in fact always was. Games are a big driver for people to upgrade. It just takes the right one, from an individual's point of view. A whole industry revolves around that.
@lokozar Yeah, I imagined this was the case for PC, but in my head it didn't make sense for consoles to have, say, the Pro in mind when setting out to make a game if at the end of the day sales are gonna come from base consoles - but as is typically the case, I'm evidently wrong.
But still, it's hard not to feel like the base last-gen consoles are holding Cyberpunk back, in that resources are being spent trying to make that game run well on consoles that might have no business running it. Extreme and unique example, but eh, it's there.
@zupertramp We will have to see exactly how things pan out over the generation, but the design philosophy was to scale down the GPU in line with scaling down the graphical presentation.
Mathematically speaking, you have about a third of the computational capability of the Series X - but you're only rendering less than 'half' the image size. As we know, some games only render 'half' of 4K each frame anyway, because they don't have the graphical power to render 'native' 4K - this is then combined with previous frames to deliver a reconstructed 4K image. 1440p with 'no' reconstruction needs less processing power - it's a 'smaller' image, there's no reconstruction step etc., so it doesn't need the same power.
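To put rough numbers on that (my own back-of-envelope figures using the commonly quoted specs, not from this thread): the Series X is usually listed at around 12.1 TFLOPS and the Series S at around 4 TFLOPS, so roughly a third of the compute, while 1440p is roughly 44% of the pixels of 4K - the 'one third of the power for less than half the image' trade-off described above.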
My 'concern' (not that I am 'concerned' really) is the long term - especially if we move to full RT Global Illumination, for example (as Metro Exodus is doing with its new next-gen update). With a much smaller GPU, can it handle enough rays to bring a scene to life without too much 'noise'?
At the moment, RT GI is combined with standard rasterisation techniques, so if you turn off RT, you fall back on the traditional look. We haven't heard anything about Minecraft RT for consoles - a fully path-traced game that was shown running at 1080/30 on a Series X - is that not coming now because of the Series S? If not, and it runs at 1080/30 on X, how does it run on Series S?
RT is a 'graphical' setting though, and like I said, the Series S is a 'graphically' compromised console compared to the Series X, so we will just have to see. It's much easier for devs to develop a game with purely RT GI - no need to bake in shadows, place fake lights to simulate the bounces, no need to add ambient occlusion etc. - so that could eventually lead to issues. BUT then we have other aspects, like Mesh Shading, VRS, ML etc., that could free up resources to make it a non-issue.
@zupertramp All games, whether on console or PC, will start out with incredibly high quality assets - like character models, for example. Extremely high polygon counts and detail that even a 3090 couldn't handle. From these, they make lower and lower and lower quality versions (for different distances from the camera), make textures from them to fill in those 'polygons' so they look like they have more polygons and detail, and make lower quality versions of those textures too. That's why in some games you can see the point at which an object increases its polygon count and/or 'textures'. They don't start with a low quality asset and scale it up.
With the Pro/PS4 being so close in terms of specs, then maybe you might scale it up - increase the resolution or implement some reconstruction method like CB rendering - but then you can get some very soupy-looking textures, because they are made for HD displays, you don't have the RAM to stream higher quality ones, and they take up more space.
The 'design' and ambition of Cyberpunk is perhaps too much for the old hardware - again showing that the game was built first, and then they attempted to scale down aspects as well as cut the frame rate down to the 'lowest' acceptable limit to give the hardware the most time to render and process a frame. Obviously, there are other issues impacting the game on last gen - let alone the bugs - but things like AI, physics, data streaming, audio, decompression etc. - all things 'normally' handled by the CPU - also affect performance. The 'new' consoles and modern PCs have vastly superior CPUs, and in the case of the new consoles, they can also offload audio and decompression to other areas. Cyberpunk is a classic example of trying to scale a game down to run on the weakest console - it's certainly not been built to the XB1's spec and scaled up...
A lot of first-party console game developers may well build their game around that console's specs - but they are still 'scaling' things down to 'run' on their hardware. They also have the added advantage of 'changing' their game to suit the hardware. For example, if you reach an area with extremely long-distance views and the frame rate buckles, the dev can just put a wall in, obscuring most of the view, to save the frame rate. On a multi-platform game, where that area really needs to be the same, they may have to drop the resolution (DRS?), maybe decide to cap the frame rate lower etc., to keep the same look across all versions. That's why first-party console games can be a lot more 'stable' - it's one of the advantages of designing for a single platform.
In other words, if you are designing for a 'single' platform, you can actually make changes to the design and look of the game to make it run as you want on that hardware. You can change the entire scene if you can't get it running, or cut it entirely, and no one would know.
If you are making a game for multiple platforms with a variety of hardware specs, you build the game first and then use various optimisation techniques to get it to run on that hardware, by reducing res, reducing frame rate, reducing visual settings and LoD draw distances etc. In some cases they may well make changes - like Wolfenstein on Switch vs the rest - or, if it's impacting more than just one piece of hardware, make a change to ALL versions - but generally, you build the game and scale/optimise to the hardware.
@zupertramp
I understand the concern. However, these are most likely not even the same resources. Under normal circumstances, developers divide the workload across several teams, with each team having its own project. For example, we heard that CDPR wants to pick up the Witcher franchise once again. You can bet that they are already working on it, and that those teams are not the ones working on Cyberpunk.
There is also a certain threshold where a team, or the number of teams working on one single problem, can grow too much, meaning they will lose efficiency - with one person going full steam while another is twiddling thumbs, because the first guy cannot split up a specific task. So they cannot just pull one team from one project and slap it onto another. It's not that easy, unless the company is a very small one. Often, when big developers give statements like "We pulled the team so that other projects profit from it", it's just damage control. I believe Anthem is one of the more recent examples. What they do not want to say is: "Sorry guys, this game is done and there is no silver lining to it." They have to somehow sell their decision as beneficial to the customer. But now we're deep in marketing waters ... ugh ...
@lokozar @BAMozzy I don't really have a response, but I read your posts and didn't want them to go unacknowledged, since you took the time to reply after all.
I enjoy my Series S and I think many others will enjoy it too. The majority of gamers will be playing rather than worrying about whether their console is going to hold back other consoles.
I have owned both a Series S and a Series X at the same time.
I used both on the same 1080p screen. Not 4K.
First, I played Dirt 5 on the Series S. It looked last gen.
Then I played Dirt 5 on the Series X. It is noticeably better; there are clearly a lot fewer jaggies. Also, the car textures are especially impressive.
Both on the same 1080p screen.