This is actually a very informative and interesting thread.
@Z3u5000 Interesting thread. It seems like, if anything is going to have an impact beyond just the resolution, it will be the RAM.
“RAM: Well, this is another interesting area, as this isn’t what I was expecting. Truth be told, I was expecting it to be the same speed, just less. However, as you can see, RAM has not only dropped in size, but speed too. 23/41”
We know from analysis of the Switch that one of the things that causes it to drop frames is its limited memory bandwidth. It will be interesting to see whether more demanding games like The Medium drop more frames on the Series S than on the Series X.
Edited on Fri 11th September, 2020 @ 21:08 by Ryall
@Ryall Since the Switch has neither GDDR6 memory nor the generational architecture optimisations, I don't know how comparable it actually is. We'll have to wait and see how it performs IRL. A lot of the people speculating about concerns likely don't have hands-on experience with the hardware kit.
@LtSarge I think that all the good work Microsoft has done still brings them to equal or superior status with Sony on the hardware and services front. Yes, Game Pass is something Sony doesn't have yet, and they won't be able to compete with the Series S price. But the strength of Sony lies in a relentless chain of big software releases.
A single Sony studio like Insomniac is already bringing two games to PS5: Spider-Man at launch (DLC or whatever, it works well as a next-gen demo) and R&C in the launch window, which looks incredibly polished. Meanwhile, the entire XGS collective managed to muster an enhanced Gears 5 and FH4, alongside a showing of Halo Infinite (a good five years in development) that became straight-up meme material and needs at least until 2021 to look ready for shipping.
While XGS is not small in number, its output is far behind that of the PlayStation studios, which already have GT7, Demon's Souls and the Horizon sequel in the pipeline (with a God of War game likely coming in 2022 or 2023?), while XGS essentially has many promising titles in early or very early development.
I have played Spider-Man, God of War and Horizon as well as Halo 4 and 5, Quantum Break, Gears 4 and 5, and the FM and FH games this gen. The truth is that only Gears and Forza had the graphics and that AAA exclusive polish Sony was offering (QB was also amazing, but it's an exclusive bought in from Remedy, so not really the topic here). Outside of Turn 10, The Coalition and Playground Games, which are easily comparable with Sony's first-party studios, the rest of XGS are only catching up. Obsidian, Ninja Theory and inXile are also big names, but I doubt they'll release anything soon and they still need to prove themselves. Avowed is potentially huge, but is it as big as Elder Scrolls or The Witcher 3? Hellblade 2 is huge, but will it generate the same hype as the next God of War? Microsoft still has a lot to prove. Not to us, the dedicated Xbox fanbase, but to millions of PS owners and people who are on the fence.
Edited on Sat 12th September, 2020 @ 14:22 by BrilliantBill
@BrilliantBill At the end of the day, first-party games won't really matter at all in the beginning of the next generation. We're still in the middle of a pandemic with hundreds of thousands of unemployed people and most likely more to come. A good chunk of those people still want to participate in the next generation of video games and they're going to be looking at the cheapest option, not which company is going to offer the most quality first-party exclusives down the line. And that's where Microsoft has the advantage with their cheap Series S as well as Game Pass Ultimate. In other words, quantity over quality and getting a lot of bang for your buck are what matters right now, so it doesn't even matter if Microsoft doesn't have many first-party exclusives coming out during the first year. Nobody who is looking to conserve as much money as possible is going to spend €60 on every new PlayStation first-party title. That's just not happening right now. And by the time the world has recovered from the pandemic, Microsoft is going to be hitting us with one first-party title after the other and early adopters are still going to experience these games through Game Pass. So if an unemployed person is looking for both a short-term and a long-term solution for their gaming needs, Microsoft is going to have that this holiday season. Sony won't though.
Edited on Sat 12th September, 2020 @ 09:29 by LtSarge
And this was back in February...
A pessimist is just an optimist with experience!
Why can't life be like gaming? Why can't I restart from an earlier checkpoint??
Feel free to add me but please send a message so I know where you know me from...
Xbox Gamertag: bamozzy
@BAMozzy Gotta admit I have never seen ultra-smooth 120fps gameplay (even in YouTube's 60fps lock) running on such a tiny profile.
Edited on Mon 14th September, 2020 @ 07:55 by Senua
@Z3u5000 I want to know what resolution that's running at. I hear people shouting 1440p, but not from the 'educated' or knowledgeable people - i.e. the ones who have expert analysis tools, have been told in an official capacity, or are the dev team themselves.
The official reveal only stated that the footage was running at 120fps. It could be 1080p/120, with maybe 1440p/60 for the single-player campaign, and maybe there is some Dynamic Resolution Scaling going on, so it could be 'up to' 1440p.
If it's 'up to', I want to know what the 'average' is as well as the 'worst' case. It's not great if it says 'up to' 1080p, 1440p, 2160p etc. but only ever hits that if you look directly at the floor or sky. The average is more important to me. You wouldn't want to play a game that says 'up to' 30fps but spends the vast majority of gameplay at 20fps.
It's the same with PS5: up to 3.5GHz CPU, up to 2.23GHz GPU. If the 'average' is a 2.0GHz GPU, that equates to 9.2TF - the 'rumoured' spec of the PS5 before the GDC announcement and only a 230MHz drop from its 'maximum' clock. For all I know, the CPU could be running at an 'average' of 3.3GHz too - lower than the 3.4GHz of the Series S - and drop even lower at times of 'intense' GPU usage.
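For anyone who wants to check that maths, here's a rough sketch (Python, using the published 36 CU / 2,304-shader PS5 spec and the standard shaders x 2 ops x clock formula - the 2.0GHz 'average' is purely my hypothetical):

```python
# Rough FP32 teraflops: shader cores x 2 ops per clock (FMA) x clock in GHz,
# divided by 1000 to go from GFLOPS to TFLOPS.
def teraflops(shader_cores: int, clock_ghz: float) -> float:
    return shader_cores * 2 * clock_ghz / 1000

PS5_SHADERS = 36 * 64  # 36 CUs x 64 shaders per CU = 2304

print(teraflops(PS5_SHADERS, 2.23))  # ~10.28 TF at the 'up to' clock
print(teraflops(PS5_SHADERS, 2.00))  # ~9.22 TF at a hypothetical 2.0GHz average
```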
The phrase 'up to' really annoys me, whether it's in games, with hardware, with broadband speeds etc. Your broadband supplier could say 'up to' 80Mbps but you only get 40Mbps on average. I used to live in an area that was 'up to' 8Mbps, was lucky to get 3.5Mbps, and probably averaged around 2.5Mbps.
There is a big difference between a game running mostly at the target frame rate and/or resolution and a game that rarely hits those metrics. If a game 'averages' 900p and, say, 25fps, even if it could hit 1080/30 in very specific situations (like looking at the sky), it's not a 1080/30 game. If the PS5 runs at 2.0GHz 95% of the time, it's a 9.2TF console that has a 'boost' built in for 'specific' situations. If 90%+ of the time you are playing at 900p (or less), it's not a 1080p game - even if it could hit 1080p...
Anyway, that's a bit off topic. I will still be interested to know what resolution Gears 5 runs at when running at 120fps. 1440p is not quite 'double' the pixel count of 1080p, but still a LOT of pixels to render in just 8.333ms. I know it runs at up to 1080p and 60fps on an XB1S, but it is using Dynamic Resolution Scaling and Temporal Upscaling to 'hit' that 1080p, often dropping down quite a lot. The Series S could still utilise that too - so it could be running at 1080p or less on occasion but use Temporal Upscaling to create a '1440p' image. But again, it's that 'up to' situation. Likewise, the Campaign on XB1X could drop to 1584p to keep the 60fps...
From Digital Foundry (Campaign, but the same processing method is used in MP):
As expected, X targets a native 4K at the upper bounds, while the base unit tops out at 1080p. That said, actual native rendering resolution is adjusting regularly during gameplay, producing results on X such as 1584p, 1728p, 2160p and the like. Xbox One S renders at quarter resolution by comparison - including values such as 792p and 864p up to full 1080p. The gap between S and X seems par for the course then, until you realise that the enhanced machine is essentially delivering twice the frame-rate.
However, native rendering resolution is just the start of the story. Gears 5 uses a temporal upscaling solution, increasing fidelity by drawing upon information from prior frames, so resolution values aren't quite as cut and dried as you may think.
Point is, I will be interested to know what native resolution it's running at to achieve 120fps on both consoles - whether they can run natively at the target, and whether Temporal Upscaling and/or Dynamic Resolution Scaling are still required. Just out of interest - not that it matters to me, as I will be buying a Series X anyway - but it's still some way to gauge the performance difference. 4x XB1, for example, could be double the resolution AND double the frame rate - i.e. push out 2x the frames at 2x the resolution (not quite as simple as that) - so I wouldn't be surprised if the resolution is 1440p using similar rendering methods, but 'maybe' they have opted for a 'native' 1080p instead of a temporally upscaled 1080p, with perhaps less Dynamic Resolution Scaling?
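To put some numbers on the pixel counts and frame budgets involved (a quick sketch - just the raw arithmetic, nothing about actual GPU load):

```python
# Pixel counts at common output resolutions and the per-frame render budget.
RES = {"1080p": 1920 * 1080, "1440p": 2560 * 1440, "2160p": 3840 * 2160}

print(RES["1440p"] / RES["1080p"])  # ~1.78x - more pixels, but not quite double
print(RES["2160p"] / RES["1080p"])  # 4.0x - 4K really is four times 1080p
print(1000 / 60)                    # ~16.67ms to render each frame at 60fps
print(1000 / 120)                   # ~8.33ms per frame at 120fps
```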
@BAMozzy It has been officially confirmed that Rainbow Six, and probably Gears 5 multiplayer, will run at 1440p/120fps without many graphical compromises.
According to the official Series S video, this machine is targeting 1440p at up to 120fps native rendering.
We will soon find out once Digital Foundry tests titles like the optimised Witcher 3 later this year. The Witcher 3 is an absolute beast and doesn't perform well even on beefy machines.
Edited on Mon 14th September, 2020 @ 13:28 by Senua
I don't believe that the Series S will deliver on the promise of '1440p up to 120fps'. Looking again at that DF video, those games were barely at 60fps in 1080p, and future games will be even more demanding. I don't think MS has some magic trick to make those games run at 120fps on either system without heavy optimization. So maybe the 120fps will only work on non-demanding indie titles and first-party MS games with the optimizations, but not on the rest of the AAA games.
@Z3u5000 As I said above, there is a difference between targeting 1440p and running at 120fps, for example, and actually running NATIVELY at 1440p with a locked 120fps.
As Digital Foundry found out, Gears 5 does NOT run at native 1080p at 60fps (MP) on XB1S: it can drop to 792p and is also using Temporal Reconstruction (like Chequerboard Rendering). That means that even when the internal resolution isn't dropping, it's still only rendering 50% of 1080p internally and using previous frames to reconstruct a 1920x1080 image.
I don't have any issue with using reconstruction at all if it's done well. It can really boost the perceived image quality without impacting frame rates. Native resolutions are not always the best use of resources, but I would much prefer some transparency, some 'honesty'. A game ISN'T running at 1440p, 4K or whatever IF it's running at 50% of that and using previous frames to construct a 1440p or 4K image. Horizon: Zero Dawn, God of War etc. are NOT running at 4K; they may well be sending a 4K image to your TV, but the game is running at 50% of 4K. I believe Rainbow Six isn't running at 1080p on XB1 either; it's running at 50% of that and using previous frames to construct a 1080p image.
A native resolution is rendering every single pixel every single frame. Temporal reconstruction is rendering 'half' the pixels every frame and using previously rendered pixels to fill in the gaps and create a full image - essentially rendering a 'full' image over 2 frames instead.
Gears 5 on XB1 uses temporal reconstruction and dynamic resolution scaling to maintain the target frame rate. There are times when it drops to 792p (so actually rendering 50% of that) to maintain a 60fps target. If the Series S is 4x as powerful, then in theory it could run 'native' (as in drop the temporal reconstruction and render the whole image, not half) AND double the frame rate - that's 4x - and with the extra resources freed up by not having to reconstruct, plus the extra power inside the Series S, it could potentially be more 'stable' in resolution and not drop to 792p.
Of course, they could still employ the same rendering techniques and go for a temporally reconstructed 1440p and, with any extra resources, be more 'stable' - not drop internal resolution so much and/or improve visual settings. However, that still doesn't mean the game is running at 1440p; it's running at 50% of 1440p at best.
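As a rough sketch of what that means for the actual shading workload (the 1408-pixel width for 792p is just the 16:9 assumption; the 50% figure is the temporal reconstruction ratio described above):

```python
# Pixels actually shaded per frame when temporal reconstruction (50% of the
# output frame) is combined with dynamic resolution scaling on each axis.
def pixels_shaded(out_w: int, out_h: int, drs_scale: float = 1.0,
                  temporal: bool = True) -> float:
    internal = (out_w * drs_scale) * (out_h * drs_scale)
    return internal * 0.5 if temporal else internal

print(pixels_shaded(2560, 1440, temporal=False))    # ~3.69M - true native 1440p
print(pixels_shaded(2560, 1440, temporal=True))     # ~1.84M - reconstructed 1440p
print(pixels_shaded(2560, 1440, drs_scale=0.8))     # ~1.18M - 80% DRS, reconstructed
print(pixels_shaded(1408, 792, temporal=True))      # ~0.56M - XB1S worst case
```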
Like I said, if you have a game that says it runs 'natively' at up to 1080p but spends 95% of the time at 900p or less, that is not a 1080p game; it is NOT running at 1080p. If a game says it runs at 60fps but spends 95% of the time between 30 and 40fps, it's NOT a 60fps game - that's just the 'cap'. If Gears 5 only ran at 70fps, capped at 120fps, it wouldn't be a 120fps game - it obviously does run at 120fps, I was just using that for illustrative purposes. Being capped at a certain metric does not mean running at that metric. Otherwise, you could raise the cap on any game that barely hits 30fps to 120fps and claim it's a 120fps game...
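A quick sketch of what I mean by the 'average' mattering more than the cap (the 95%/5% split is just my illustrative example from above):

```python
# Time-weighted average of a metric: 95% of runtime at 900p, 5% at 1080p.
samples = [(900, 0.95), (1080, 0.05)]  # (vertical resolution, share of runtime)
average = sum(value * share for value, share in samples)
print(average)  # 909 - much closer to a '900p game' than an 'up to 1080p' one
```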
I am NOT trying to criticise Gears 5, but I want to know what the 'actual' performance metrics are. We can clearly see it's pretty much a 120fps MP, with maybe the odd dropped frame here or there, but is it still relying on Temporal Upscaling to deliver a 1440p image? Is Dynamic Resolution Scaling still in effect too, and if so, by how much? 80% on both axes? Down to 60%?
Again, I am NOT criticising - it makes more sense to use the GPU resources to improve the visual presentation and/or render more frames per second than to push out pixels just to tick a 'native' box. I would rather have a temporally upscaled 4K than a native 1440p, and I would rather have a temporally upscaled 4K at double (or more) the frame rate than a native render could manage. I would rather have 1080p upscaled to 4K with DLSS 2.0 at 60 or 120fps than a native 4K at 30fps. Point is, native is NOT the best use of resources, BUT I still want honesty. I have more respect for honesty, and if ANY dev opts for reconstruction or uses some other technique (like VRS, ML/AI upscaling etc.) to create an image that 'looks' much higher resolution than the game is rendering at, I would rather they were honest about it.
@BAMozzy Well, comparing mere spec-sheet numbers across two huge generational hardware leaps, like DDR3 vs GDDR6 efficiency and RDNA 2 vs GCN IPC improvements, is extremely difficult. There were a lot of things even Digital Foundry themselves couldn't predict properly ahead of time, which they honestly admitted later. That's also excluding the several other optimisation features introduced in the Scarlett generation. Like I said, we'll have to wait and see once the Xbox Series S enhancement and optimisation programme starts showing results. But my bet is that in most cases we won't be seeing any sub-1440p native rendering or 'temporal reconstruction'.
Edited on Mon 14th September, 2020 @ 16:23 by Senua
@Z3u5000 The only way to assess the 'generational' leap is by comparing metrics in a like-for-like situation. Right now, with neither console available to purchase, many people only have the current generation's experience to go by and want to know what next gen will bring - whether a 4TF console can be 'enough' to deliver a next-gen gaming experience. Obviously, it's not going to be delivering a 4K/60 campaign like the Series X offers - even if that is using Temporal Reconstruction and Dynamic Resolution Scaling - because it's a 1440p console, but, unlike on the Series X, they can get a 120fps MP.
It's metrics that help people, especially those with little technological understanding, gauge what the hardware specs will offer them in terms of gaming experience. Again, to use TFs: if you know what 6TF means in relation to XB1X performance, 4TF sounds much less capable - less even than a PS4 Pro. The way to understand the improvements in efficiency and the optimisations built into the hardware - what all that actually translates to - is how the same games now run: what improvements can be made to the visual settings, what resolutions and frame rates can now be achieved, etc.
I know it's not as simple as '4x the resolution requires 4x the power' - there are a lot of other factors to consider. The easiest way to demonstrate generational improvements, or even different hardware specs, is by showing how much 'faster' a known game runs with the same settings. That's why Digital Foundry uses the same games running at the same settings to illustrate the differences.
It's all well and good saying the Series S is a 4TF RDNA 2 GPU-based console, but the way to understand what that means in relation to your current hardware is to compare the same game and assess the enhancements the team have managed. Obviously we are seeing a doubling of the frame rate and an increase in the 'target' resolution, but we also know the XB1S wasn't running at a 'native' 1080p and could drop to 792p. Even without adding in other enhancements - like more particle effects and longer draw distances, which were confirmed for Series X - that's a 'bigger' jump than 4x, before you even count the change from a temporally upscaled image to native.
We are also looking to see what differences the Series X may have over a PS5 - similar architecture, different specs, different approaches and different customisation. The PS5 won't have DirectX RT because that's MS's, and it won't have DirectML either, for the same reason. I keep seeing Sony fanboys harping on about the PS5's audio capability because Mark Cerny made a big deal out of it, whereas MS were far more reserved - but looking at the 'claims' from both sides, Sony's audio is equivalent to their 8-core PS4 Jaguar CPU (clocked at 1.6GHz) and MS say theirs is more powerful than all 8 cores of the XB1X CPU (clocked at 2.3GHz), so if accurate, that would make the Series X's a lot more powerful, a lot more capable.
But until we have comparisons, we just don't know - the PS5 could be far better optimised, far more efficient, and easier for devs to get more out of. Same with the rest of the hardware, and the only way we will see what these specs mean in context is to see how the SAME games run - the performance metrics and any differences in visual settings on each.
People want to know what the 'specs' translate to in terms of gaming. The best way to illustrate that is by comparing like for like. We will no doubt see comparison videos in the future and see how games like Assassin's Creed: Valhalla or Cyberpunk 2077 (when that gets next-gen enhancements, and even how it runs via BC) compare on PS4, PS4 Pro, PS5, Xbox One, Xbox One X, Series S and Series X - even how they compare to a high-end PC on 'Ultra' settings. No doubt we'll also be informed of any evidence of Temporal Upscaling or VRS being used, and how any RT is being implemented, and see side-by-side comparisons showing what differences exist - what devs may have had to scale down or implement (VRS, DRS etc.) to achieve certain metrics, like higher-resolution or higher-frame-rate modes.
It's no different from this gen, where we saw what 'differences' the new gen offered over the last gen at the start, and what differences there were between the two consoles: the difference the bigger GPU and faster RAM in the PS4 made despite a slower CPU; the difference the 6TF XB1X had over the 4.2TF Pro and how much of a difference that made over the 'base' consoles.
It's all well and good saying the 'new' gen is more efficient than the old gen, but the only way to demonstrate that is by directly comparing the same software and comparing the performance metrics. It's also why both AMD and nVidia use games to illustrate it - showing that a game runs faster on new hardware and by how much - even if they do 'cherry-pick' certain software to do so.
@BAMozzy There are a lot of nuances in comparing raw hardware specs and things like 'development efficiency and ease of optimisation' when these consoles/platforms were, in fact, originally targeted at different launch timelines. There's no doubt the development and software tools absolutely need time to mature and improve in order to provide a more stable and easy experience for developers. It's still very early to speculate about something like which is 'easier to optimise'. The new Game Core dev platform, which is (still) being developed for building next-gen titles for the Scarlett family of consoles, also provides ease of scalability and optimisation across next-gen hardware platforms. Going by Microsoft's track record of delivering high-quality developer tools and a mature ecosystem (which is actually where most of its billions of $s come from), this honestly shouldn't be anyone's worry. With upcoming features like the DirectStorage API, next-gen game design is set to fundamentally change forever, as developers won't need to store redundant assets in nearby sectors of spinning disks. This alone will cut down game sizes. Couple this with the fact that the Series S can constantly change its memory data on the fly, every second, with far superior memory usage over last gen, and next-gen performance just cannot be predicted to its full extent.
Edited on Mon 14th September, 2020 @ 21:07 by Senua
You are missing the point. All of that is more technological babble, and many console gamers don't really care about any of it; they only want to know what the games will look and run like, and how much 'better' they are compared to what they are getting now.
There are some that don't even care too much about that. I have seen quite a few comment that they don't see the point in buying a Series S/X because they can play the games on their Xbox One. Even the X/Pro line wasn't an 'easy' sell to a lot of people because, as far as they were concerned, they didn't have a 4K TV so there were 'no' advantages for them at all.
A lot of people want to know what upgrades the next-gen version offers over the current-gen one - especially when the majority of games releasing will also be available on their current console. If they can play Cyberpunk, Far Cry, Call of Duty, FIFA and Assassin's Creed, what difference will playing them on Series S/X make? Is it 'worth' spending £250/£450, or better to wait until these consoles have enough released games they can't otherwise play (games like The Medium, Scorn, STALKER 2 and Warhammer, as well as Forza and Fable, it seems)?
I doubt they care too much about whether or not it has an SSD inside; they care more about the 'storage' capacity and what difference it makes, and the only way to show the difference is to compare loading times, for example - show the 'big' difference in those and it has 'more' impact than a generic 'faster loading' statement. It's difficult to illustrate what 120fps offers, but comparing input lag measurements, even if people don't really know what that 'feels' like, can illustrate the difference and give them some 'idea', because they have the current gen as a reference point.
Visual comparisons do far more than 'specs' or statements alone. Talking about the TFLOPs, Velocity Engine, asset management, decompression etc. is almost meaningless until you compare a game running on current and next-gen hardware. Show how much quicker something loads; show and highlight the visual differences. It's like showing the OG version of a game and then swapping to a remaster - it says far more than a list of changes.
There are some who have a better understanding of the hardware and the generational improvements that mean things like TFs are not 'linear' - i.e. 12TF is not 2x better than 6TF (unless it's the exact same generation of architecture) - but when something seems a LOT lower, like the 4TF Series S vs the 6TF XB1X or even the 4.2TF PS4 Pro, it's important to show that 4TF is better. Especially when MS are saying it's a 1440p console and not offering XB1X BC (though that's more to do with RAM), because they are using the GPU power to pump out frames rather than pixels.
Comparisons will do so much more than spec sheets, words etc. I know MS had a bad E3 in 2013, but that was compounded by comparisons between the PS4 and XB1 versions of games, showing the differences in resolution and performance. It showed what the difference in GPU and RAM actually meant.
Question: does anybody know if Halo MCC is being optimized for the XSX/XSS?
“The way that we designed the developer environment was that a developer would ideally target 4K at 60 fps, up to a 120 fps on Xbox Series X, and then they could easily scale down to the Xbox Series S by reducing the rendering resolution to 1440p,” Ronald said.
“There are also opportunities where we can enhance the titles on Xbox Series S even further than what we can do on Xbox One X,” says Ronald. “If you look at the raw power of the Xbox Series S, if a title wants to go in and double its frame rates it’s actually really easy, because we’ve more than doubled the GPU performance and more than doubled the CPU performance, so it’s relatively easy for a developer to go in and enable that if they choose to update their title.”
“With RDNA 2 we get basically a 25 percent performance uplift over GCN with no work by developers at all,” claims Ronald. “There’s a significant amount of efficiency we’re getting out of RDNA2 relative to GCN. Then we look at other things like using float 16 or variable rate shading, and we’re seeing on the order of 10-20 percent performance benefits from there as well.”
“It is really difficult to compare raw numbers like teraflops across generations, because we think about them differently,” says Ronald. “In general on the GPU side the Xbox Series S is effectively the same performance as an Xbox One X GPU, but it brings all the next-gen features like ray tracing, VRS, mesh shaders, and obviously when you look at the massive leaps in CPU performance and I/O performance, that’s why Xbox Series S is designed to deliver that true next-gen experience just targeting a lower resolution than Xbox Series X.”
4K is supported on the Xbox Series S through upscaling, and Microsoft is doing work at the display controller level to get higher image quality and improved color correction for crisper and more accurate images when games are being upscaled.
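Taking those quoted figures at face value, the arithmetic roughly lines up with the 'same as a One X GPU' claim (a sketch only - the 15% is just the midpoint of the quoted 10-20% range, and these gains won't apply uniformly across real workloads):

```python
# Series S raw spec x quoted RDNA 2 uplift, expressed in GCN-era terms.
rdna2_tf = 4.0                         # Series S raw FP32 teraflops
gcn_equivalent = rdna2_tf * 1.25       # +25% IPC over GCN -> ~5.0 'GCN TF'
with_fp16_vrs = gcn_equivalent * 1.15  # +15% (mid of 10-20%) from FP16/VRS
print(gcn_equivalent, with_fp16_vrs)   # ~5.0 and ~5.75 - near the 6TF One X
```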
Edited on Thu 17th September, 2020 @ 15:46 by Senua