The negativity towards this is wild. A company followed relatively widely accepted industry practice (lots and lots of other games also have huge sizes on disk for the exact same reason), then eventually they decided to do their own independent testing to check whether said common practice actually makes things better or not in their case, found that it didn't, so they reversed it. In addition, they wrote up some nice technical articles on the topic, helping to change the old accepted industry wisdom.
This seems great to me. Am I crazy? This feels like it should be Hacker News's bread and butter, articles about "we moved away from Kubernetes/microservices/node.js/serverless/React because we did our own investigation and found that the upsides aren't worth the downsides" tend to do really well here. How is this received so differently?
Arrowhead probably deserves more love for breaking the norm, but I think it's overshadowed by people finding out for the first time that the reason HDDs are so common in gaming setups is that companies have been blindly shaving a few seconds off HDD load times at the cost of 7x the disk space.
If it had been more widely known that this was the cause of game bloat, this probably would have been better received. Still, Arrowhead deserves more credit both for testing and breaking the norm as well as for making it a popular topic.
Part of what makes this outrageous is that the install size itself is probably a significant part of the reason to install the game on an HDD.
154GB vs 23GB can trivially make the difference of whether the game can be installed on a nice NVMe drive.
Is there a name for the solution to a problem (make size big to help when installed on HDD) in fact being the cause of the problem (game installed on HDD because big) in the first place?
Can any games these days be reliably run on HDDs with a max of 200 MB/s throughput (at best)? Or does everyone get a coffee and some cookies when a new zone loads? Even with this reduction that will take a while.
I thought they all required SSDs now for "normal" gameplay.
Until you get to super-high-res textures and the like, the throughput isn't nearly as important as the latency.
At 200 MB/s the way hard drives usually measure it, you're able to read up to 390,625 512-byte blocks in 1 second, or to put it another way, a block that's immediately available under the head can be read in 2.56 microseconds. On the other hand, at 7200 RPM, it takes up to 8.33 milliseconds to wait for the platter to spin around and reach a random block on the same track. Even if these were the only constraints, sequentially arranging data you know you'll need to have available at the same time cuts latency by a factor of about 3000.
It's much harder to find precise information about the speed of the head arm, but it also usually takes several milliseconds to move from the innermost track to the outermost track or vice versa. In the worst case, this would double the random seek time, since the platter has to spin around again because the head wasn't in position yet. Also, since hard drives are so large nowadays, the file system allocators actually tend to avoid fragmentation upfront, leading to generally having few fragments for large files (YMMV).
So, the latency on a hard drive can be tolerable when optimized for.
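For anyone who wants to sanity-check those numbers, a quick back-of-the-envelope in Python (assuming 200 MB/s sustained sequential reads and 7200 RPM, and ignoring head movement):

    # Rough HDD latency arithmetic for the numbers above.
    THROUGHPUT = 200e6   # bytes/s, sustained sequential read
    BLOCK = 512          # bytes per block
    RPM = 7200

    block_time = BLOCK / THROUGHPUT   # reading a block already under the head
    rotation = 60 / RPM               # worst-case wait for the platter to come around

    print(f"blocks per second:    {THROUGHPUT / BLOCK:,.0f}")       # ~390,625
    print(f"one sequential block: {block_time * 1e6:.2f} us")       # ~2.56 us
    print(f"full rotation wait:   {rotation * 1e3:.2f} ms")         # ~8.33 ms
    print(f"ratio:                ~{rotation / block_time:,.0f}x")  # ~3,250x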
> On the other hand, at 7200 RPM, it takes up to 138 microseconds to wait for the platter to spin around and reach a random block on the same track.
You did the math for 7200 rotations per second, not 7200 rotations per minute = 120 rotations per second.
In gaming terms, you get at most one or two disk reads per frame, which effectively means everything has to be carefully prefetched well in advance of being needed. Whereas on a decade-old SATA SSD you get at least dozens of random reads per frame.
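Roughly where those numbers come from, assuming ballpark figures of ~10 ms per HDD random read (seek plus rotational latency) and ~0.1 ms per random read on an old SATA SSD:

    FRAME_MS = 1000 / 60   # ~16.7 ms frame budget at 60 FPS
    HDD_READ_MS = 10       # ballpark random read on a hard drive
    SSD_READ_MS = 0.1      # ballpark random read on a decade-old SATA SSD

    print(f"HDD random reads per frame: ~{FRAME_MS / HDD_READ_MS:.1f}")   # ~1.7
    print(f"SSD random reads per frame: ~{FRAME_MS / SSD_READ_MS:.0f}")   # ~167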
My immediate question is: if all of that was on-disk data duplication, why did it affect download size? Can't a small download be expanded into the optimal layout on the client side?
It didn't. They downloaded 43 GB instead of 152 GB, according to SteamDB: https://steamdb.info/app/553850/depots/ Now it is 20 GB => 21 GB. Steam is pretty good at deduplicating data in transit from their servers. They are not idiots that will let developers/publishers eat their downstream connection with duplicated data.
Sure it can - it would need either special pre- and postprocessing or lrzip ("long range zip") to do it automatically. lrzip should be better known, it often finds significant redundancy in huge archives like VM images.
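A minimal sketch of why that works, assuming naive fixed-size chunking (lrzip uses long-range matching that also catches unaligned duplicates, and Steam ships content as hashed chunks):

    import hashlib, os

    CHUNK = 1 << 20  # 1 MiB fixed-size chunks; real tools use content-defined boundaries

    def digests(data: bytes):
        return [hashlib.sha256(data[i:i + CHUNK]).digest()
                for i in range(0, len(data), CHUNK)]

    # Two distinct 4 MiB "assets"...
    tree = os.urandom(4 * CHUNK)
    rock = os.urandom(4 * CHUNK)

    # ...duplicated into every "level pack", HDD-optimisation style.
    build = (tree + rock) * 7

    d = digests(build)
    print(f"{len(build) >> 20} MiB on disk, {len(d)} chunks, "
          f"{len(set(d))} unique chunks to actually transfer")  # 56 MiB, 56 chunks, 8 unique

The duplicated copies all hash to chunks the client has already seen, so only the unique data has to cross the wire.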
It would be one thing if it was a 20% increase in space usage, or if the whole game was smaller to start with, or if they had actually checked to see how much it assisted HDD users.
But over 6x the size with so little benefit for such a small segment of the players is very frustrating. Why wasn't this caught earlier? Why didn't anyone test? Why didn't anyone weigh the pros and cons?
It's kind of exemplary of HD2's technical state in general - which is a mix of poor performance and bugs. There was a period where almost every other mission became impossible to complete because it was bugged.
The negativity is frustration boiling over from years of a bad technical state for the game.
I do appreciate them making the right choice now though, of course.
It was a choice, not an oversight. They actively optimised for HDD users, because they believed that failing to do so could impact load times for both SSD and HDD users. There was no speed penalty in doing so for SSD users, just a disk usage penalty.
Helldivers II was also much smaller at launch than it is now. It was almost certainly a good choice at launch.
You make a million decisions in the beginning of every project. I'm certain they made the choice to do this "optimization" at an early point (or even incidentally copied the choice over from an earlier project) at a stage where the disk footprint was small (a game being 7GB when it could've been 1GB doesn't exactly set off alarm bells).
Then they just didn't reconsider the choice until, well, now.
Even at the end of development it’s a sensible choice. It’s the default strategy for catering to machines with slow disk access. The risk of some players experiencing slow load times is catastrophic at launch. In absence of solid user data, it’s a fine assumption to make.
The thing is, first impressions matter. This was John Carmack's idea for how to sell interlacing to smartphone display makers for VR: the upsell he had was that there's one very important moment when a consumer sees a new phone: they pick it up, open something and flick it, and that scroll effect had better be a silky smooth 60 FPS or more, or there's trouble. (His argument was that making that better would be a side effect of what he really wanted.)
>But over 6x the size with so little benefit for such a small segment of the players is very frustrating. Why wasn't this caught earlier? Why didn't anyone test? Why didn't anyone weigh the pros and cons?
Have you never worked in an organization that made software?
Damn near everything can be 10x as fast and using 1/10th the resources if someone bothered to take the time to find the optimizations. RARE is it that something is even in the same order of magnitude as its optimum implementation.
I think what makes this a bit different from the usual "time/value tradeoff" discussion is bloating the size by 6x-7x was the result of unnecessary work in the name of optimization instead of lack of cycles to spend on optimization.
Eh probably not, it's probably handled by some automated system when making release builds of the game. Sure, implementing that initially was probably some work (or maybe it was just checking a checkbox in some tool), but there's probably not much manual work involved anymore to keep it going.
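As a purely hypothetical illustration of what such a build step might look like (invented names, not Arrowhead's actual pipeline), the "duplication" is basically a packaging script that copies every shared asset into each level archive so a level's data stays contiguous on disk:

    import zipfile
    from pathlib import Path

    def pack_level(level_dir: Path, shared_assets: list[Path], out_dir: Path) -> Path:
        """Bundle one level plus a full copy of every shared asset it references."""
        archive = out_dir / f"{level_dir.name}.pak"
        with zipfile.ZipFile(archive, "w") as pak:
            for f in level_dir.rglob("*"):
                if f.is_file():
                    pak.write(f, str(f.relative_to(level_dir)))
            for asset in shared_assets:
                # The duplication step: each archive gets its own copy, so an HDD
                # can stream a whole level without seeking off to a shared pack.
                pak.write(asset, str(Path("shared") / asset.name))
        return archive

Once a step like that exists it just runs on every build; nobody has to think about it again, which is exactly how it can survive unquestioned for years.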
Reverting it now though, when the game is out there on a million systems, requires significant investigation to ensure they're not making things significantly worse for anyone, plus a lot of testing to make sure it doesn't outright break stuff.
Optimization takes up time, and often it takes up the time of an expert.
Given that, people need to accept higher costs, longer development times, or reduced scope if they want better optimized games.
But what is worse is that just trying to optimize software is not the same as successfully optimizing it. So time and money spent on optimization might yield no results, because there might not be any more efficiency to be gained, the person doing the work lacks the technical skill, the gains are part of a tradeoff that cannot be justified, or the person doing the work can't make a change (i.e., a 3rd-party library is the problem).
The lack of technical skill is a big one, IMO. I'm personally terrible at optimizing code, but I'm pretty good at building functional software in a short amount of time. We have a person on our team who is really good at it and sometimes he'll come in after me to optimize work that I've done. But he'll spend several multiples of the time I took making it work and hammering out edge cases. Sometimes the savings is worth it.
The trade off they're talking about is to arrive at the same end product.
The reason games are typically released as "fetuses" is because it reduces the financial risk. Much like any product, you want to get it to market as soon as is sensible in order to see if it's worth continuing to spend time and money on it.
> God why can’t it just be longer development time.
Where do you stop? What do the 5 tech designers do while the 2 engine programmers optimise every last byte of network traffic?
> I’m sick of the premature fetuses of games.
Come on, keep this sort of crap off here. Games being janky isn't new - look at old console games and they're basically duct-taped together. Go back to Half-Life 1 in 1998 - the Xen world is complete and utter trash. Go back farther and you have stuff that's literally unplayable [0], or things that were so bad they literally destroyed an entire industry [1], or rendered the game uncompletable [2].
Super Mario 64, widely recognized as one of the most iconic and influential games ever... was released with a build that didn't have the compiler optimizations turned on. They proved this by decompiling it and, with the exact right compiler and tools, recompiling it with the non-optimized arguments. Recompiling with the optimizations turned on resulted in no problems and significant performance boosts.
One of the highest rated games ever released without devs turning on the "make it faster" button which would have required approximately zero effort and had zero downsides.
This kind of stuff happens because the end result A vs. B doesn't make that much of a difference.
And it's very hard to have a culture of quality that doesn't get overrun by zealots who will bankrupt you while they squeeze the last 0.001% of performance out of your product before releasing. It is very hard to have a culture of quality that does the important things and doesn't do the unimportant ones.
The people who obsess with quality go bankrupt and the people who obsess with releasing make money. So that's what we get.
A very fine ability for evaluating quality mixed with pragmatic choice for what and when to spend time on it is rare.
But this isn't an optimization. The 150+GB size is the "optimization", one that never actually helped with anything. The whole news here is "Helldivers 2 stopped intentionally screwing its customers".
I don't see why it's a surprise that people react "negatively", in the sense of being mad that (a) Helldivers 2 was intentionally screwing the customers before, and (b) everyone else is still doing it.
At one point (I think it was Titanfall 2), the PC port of a game deliberately converted its audio to uncompressed WAV files in order to inflate the install size. They said it was for performance, but the theory was that it was to make it more inconvenient for pirates to distribute.
When the details of exactly why the game was so large came out, many people felt this was a sort of customer betrayal: the publisher was burning a large part of the volume of your precious high-speed SSD for a feature that added nothing to the game.
People probably feel the same about this, why were they so disrespectful of our space and bandwidth in the first place? But I agree it is very nice that they wrote up the details in this instance.
> They said it was for performance but the theory was to make it more inconvenient for pirates to distribute.
This doesn't even pass the sniff test. The files would just be compressed for distribution and decompressed on download. Pirated games are well known for having "custom" installers.
>The files would just be compressed for distribution and decompressed on download
All Steam downloads are automatically compressed. It's also irrelevant. The point is that playback of uncompressed audio is indeed cheaper than playback of compressed audio.
> The point is that playback of uncompressed audio is indeed cheaper than playback of compressed audio.
Even when Titanfall 2 was released in 2016, I don't think that was meaningfully the case. Audio compression formats have been tuned heavily for efficient playback.
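Whatever the truth about decode cost, the disk-space side of the tradeoff is easy to put numbers on (assuming 16-bit stereo PCM at 44.1 kHz versus a typical ~160 kbps lossy encode):

    pcm = 44_100 * 2 * 2        # sample rate * channels * bytes/sample = 176,400 B/s
    lossy = 160_000 / 8         # ~160 kbps Vorbis/MP3-class encode = 20,000 B/s

    hours = 10                  # a plausible amount of music and dialogue for a big shooter
    print(f"uncompressed: {pcm * 3600 * hours / 1e9:.1f} GB")    # ~6.4 GB
    print(f"lossy:        {lossy * 3600 * hours / 1e9:.1f} GB")  # ~0.7 GB

So storing audio uncompressed can easily add many gigabytes once there is enough of it.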
> When the details of exactly why the game was so large came out, many people felt this was a sort of customer betrayal: the publisher was burning a large part of the volume of your precious high-speed SSD for a feature that added nothing to the game.
Software developers of all kinds (not just game publishers) have a long and rich history of treating their users' compute resources as expendable. "Oh, users can just get more memory, it's cheap!" "Oh, xxxGB is such a small hard drive these days, users can get a bigger one!" "Oh, most users have Pentiums by now, we can drop 486 support!" Over and over we've seen companies choose to throw their users under the bus so that they can cheap out on optimizing their product.
I remember seeing warez game releases in the late 90s that had custom packaging to de-compress sound effects that were stored uncompressed in the original installer.
It seems no one takes pride in their piracy anymore.
It's because shitting on game devs is the trendy thing these days, even among more technically inclined crowds unfortunately. It seems like there's a general unwillingness to accept that game development is hard and you can't just wave the magic "optimize" wand at everything when your large project is also a world of edge cases. But it seems like it should be that simple according to all the armchair game devs on the internet.
The level of work that goes into even “small” games is pretty incredible. When I was a grad student, another student was doing their (thesis-based, research-focused) master's while working at EA on a Street Fighter(?) game.
The game programming was actually just as research focused and involved as the actual research. They were trying to figure out how to get the lowest latency and consistency for impact sounds.
the engineer's disease: "I'm smarter than you and I need to prove it, and we're so smart we wouldn't have shipped this code in the first place" bla bla bla
also keep in mind that modern gaming generates more revenue than the movie industry, so it's in the interests of several different parties to denigrate or undermine any competing achievement -- "Bots Rule Every Thing Around Me"
For me it's not so much about shitting on game devs as it is about shitting on the ogres that run game companies. Any of us who have done development should understand we have little control over scope and often want to do more than the business allows us to.
That is completely ok in my opinion. It's just most discourse I come across treats the developers as complete amateurs who don't know what they're doing. As someone who's a professional dev myself I just can't get behind bashing the people doing the actual work when I know we're all dealing with the same business realities, regardless of industry.
Meh, the same is true for almost every discussion on the internet; everyone is an armchair expert on whatever subject you come across, and when you ask them about their experience it boils down to "I read lots of Wikipedia articles".
I mean, I agree with you that it is trendy, and seemingly easy, to shit on other people's work. At this point it seems to be a challenge people take up upon themselves to criticise something in the most flowery and graphic way possible, hoping to score those sweet internet points.
For maybe the last 6-7 years I've stopped reading reviews and opinions about newly launched games completely; the internet audience (and reviewers) are just so far off base compared to my own perspective and experience that it has become less than useless, it's just noise at this point.
There has long been a trend that "software engineers" and "computer scientists" both have been rather uninterested in learning the strategies that gaming developers use.
Really, the different factions in software development are a fascinating topic to explore. Add embedded to the discussion, and you could probably start fights in ways that flat out don't make sense.
Many players perceive Arrowhead as a pretty incompetent and untrustworthy developer. Helldivers has suffered numerous issues with both performance and balancing. The bugs constantly introduced into the game (not the fun kind you get to shoot with a gun) have eroded a lot of trust and good will towards the company and point towards a largely non-existent QA process.
I won’t state my own personal views here, but for those that share the above perspective, there is little benefit of the doubt they’ll extend towards Arrowhead.
The negativity comes from the zero effort they put into this prior to launch. Forcing people to download gigs of data that was unnecessary.
Game studios no longer care how big their games are if Steam will still take them. This is a huge problem. GTA5 was notorious for loading JSON again, and again, and again during loading, and it was just a mess. Same for HD2: game engines have the ability to only pack what is used, but it's still up to the developers to make sure their assets are reusable so as to cut down on size.
This is why Star Citizen has been in development for 15 years. They couldn't optimize early and were building models and assets like they were for film. Not low-poly game assets but super-high-poly film assets.
The anger here is real. The anger here is justified. I'm sick of having to download 100gb+ simply because a studio is too lazy and just packed up everything they made into a bundle.
> They couldn't optimize early and were building models and assets like it's for film. Not low poly game assets but super high poly film assets.
Reminds me of the Crack.com interview with Jonathan Clark:
Adding to the difficulty of the task, our artist had no experience in the field. I remember in a particular level we wanted to have a dungeon. A certain artist begin by creating a single brick, then duplicating it several thousand times and building a wall out of the bricks. He kept complaining that his machine was too slow when he tried to render it. Needless to say this is not the best way to model a brick wall.
this is very very common as there are only a handful of schools that teach this. Displacement mapping with a single poly is the answer. Game-dev-focused schools have this, but at any other visual media school it's "build a brick, array the brick 10,000 times".
There were 20 people working on this game when they started development. Total. I think they expanded to a little over 100. This isn't some huge game studio that has time to do optimization.
Size of team has no bearing on this argument. Saying they were small so they get a pass on obscene download sizes is like saying “Napster was created by one man, surely he shouldn’t be accountable” - but he was.
When making a game, once you have something playable, the next step is to figure out how to package it. This is included in that effort: determining which assets to compress, package, and ship. Sometimes this is done by the engine. Sometimes this is done by the art director.
This isn’t a resourcing issue. It’s a lack of knowledge and skipped a step issue.
When I did this, my small team took a whole sprint to make sure that assets were packed, that tilemaps were made, that audio files were present, and we did an audit to make sure nothing extra was packaged on disk. Today, because of digital stores and just releasing zip files, no one cares what they ship, and often you can see it if you investigate the files of any Unity or Unreal Engine game. Just throw it all over the fence.
> Probably because many are purists. It is like how anything about improving Electron devolves into "you shouldn't use Electron."
The Electron debate isn't about purism over details; the Electron debate is about the foundation being a pile of steaming dung.
Electron is fine for prototyping, don't get me wrong. It's an easy and fast way to ship an application, cross-platform, with minimal effort and use (almost) all features a native app can, without things like CORS, permission popups, browser extensions or god knows what else getting in your way.
But it should always stay a prototype and eventually be shifted to a native application. Unlike Internet Explorer in its heyday, which you could trivially embed as ActiveX without it leading to resource gobbling, if you now have ten apps consuming 1 GB of RAM each just for the Electron base to run, the user runs out of memory - because, like PHP, nothing is shared.
Each person seems to have their own bugbear about Electron, but I really doubt improving Electron to have shared instances a la WebView2 would make much of a dent in the hate for it here.
Or these devs & users can migrate to a PWA, which will have vastly less overhead because the runtime is shared; each of those 10 apps you mention would be (or could be, if they have OK data architecture) tiny.
PWAs have the problem that for every interaction with the "real world" they need browser approval. While that is for a good reason, it also messes with the expectations of the user, and some stuff such as unrestricted access to the file system isn't available to web apps at all.
The negativity wasn't created in a vacuum. Arrowhead has a long track record of technical mishaps and a proven history of erasing all evidence about those issues without ever trying to acknowledge them. Reddit, Discord and YouTube comment sections are heavily moderated. I suspect there might be a 3rd party involved in this which doesn't forward any technical issues if the complaint involves any sign of frustration. Even the relation with their so-called "Propaganda Commanders" (official moniker for their YouTube partner channels) has been significantly strained in two cases, over trivialities.
It took Sony's intervention to actually pull back the game into playable state once - resulting in the so called 60 day patch.
Somehow random modders were able to fix some of the most egregiously ignored issues (like an enemy type making no sound) quickly and effectively. Arrowhead ignored it, then denied it, then used the "gamers bad" tactic and banned people pointing it out. After a long time, they finally fixed it and tried to bury it in the patch notes too.
They also have been caught straight up lying about changes; the most recent one was "Apparently we didn't touch the Coyote", where they simply buffed enemies' resistance to fire, effectively nerfing the gun.
Sony nearly killed all good will the game had accrued when they tried to use the massive player base as an opportunity to force people into their worthless ecosystem. I don't think Sony even has the capability to make good technical decisions here, they are just the publisher. It was always Arrowhead trying to keep up with their massive success that they clearly weren't prepared for at all. In the beginning they simply listened to some very vocal players' complaints, which turned out to not be what the majority actually wanted. Player driven development is hardly ever good for a game.
So, wanting:
- Their PC to not reboot and BSOD (this was a case a few months ago)
- Be able to actually finish a mission (game still crashes a lot just after extraction, it's still rare for the full team to survive 3 missions in a row)
- Be able to use weapon customisation (the game crashed, when you navigated to the page with custom paints)
- Continue to run, even when anybody else from the team was stimming (yes, any person in the team stimming caused others to slow down)
- Actually be able to hear one of the biggest enemies in the game
- To not issue stim/reload/weapon change multiple times, for them just to work (it's still normal to press stim 6 times in some cases, before it activates, without any real reason)
- Be able to use chat when in a vehicle (this would result in using your primary weapon)
- Be able to finish drill type mission (this bugs out a lot still)
- Not be attacked by enemies that phase through buildings
- Not be attacked by bullets passing through terrain, despite the player bullets being stopped there
are all just "vocal players' complaints"? A lot of those bugs went totally unaddressed for months. Some keep coming back in regressions. Some are still ongoing. This is only a short list of things I came across while casually playing. It's a rare sight to have a full OP without an issue (even mission hardlocks, still).
About Sony - I was specifically referring to Shams Jorjani's (CEO of Arrowhead) explanation to Hermen Hulst (the head of PlayStation Studios) of why the review score collapsed to 19%, among other issues.
As someone with 700 hours in the game, I've played the game both on Windows and Linux.
A lot of issues are to do with the fact that the game seems to corrupt itself. If I have issues (usually performance related), I do a steam integrity check and I have zero issues afterwards. BTW, I've had to do this on several games now, so this isn't something that is unique to HellDivers. My hardware is good BTW, I check in various utils and the drives are "ok" as far as I can tell.
> - Their PC to not reboot and BSOD (this was a case a few months ago)
This was hyped up by a few big YouTubers. The BSODs were because their PCs were broken. One literally had a burn mark on their processor (a known issue with some board/processor combos) and the BSODs went away when they replaced their processor. This tells me that there was something wrong with their PC and any game would have caused a BSOD.
So I am extremely sceptical of any claims of BSODs because of a game. What almost is always the case is that the OS or the hardware is at issue and playing a game will trigger the issue.
If you are experiencing BSODs I would make sure your hardware and OS are actually good, because they are probably not. BTW, I haven't had a BSOD in Windows for about a decade because I don't buy crap hardware.
> - Be able to actually finish a mission (game still crashes a lot just after extraction, it's still rare for the full team to survive 3 missions in a row)
False. A few months ago I played it for an entire day and the game was fine. Last week I played it a good portion of Saturday night. I'm in several large HellDivers focused Discord servers and I've not heard a lot of people complaining about it. Maybe 6 months ago or a year ago this was the case, but not now.
> Be able to use weapon customisation (the game crashed, when you navigated to the page with custom paints)
This happened for like about a week for some people and I personally didn't experience this.
> To not issue stim/reload/weapon change multiple times, for them just to work (it's still normal to press stim 6 times in some cases, before it activates, without any real reason)
I've not experienced this. Not heard anyone complain about this either, and I am in like 4 different Helldivers-focused Discord servers.
> Not be attacked by enemies that phase through buildings
This can be annoying, but it happens like once in a while. It isn't the end of the world.
> So I am extremely sceptical of any claims of BSODs because of a game.
Generally speaking, I am too. That is unless there is kernel-level anticheat. In that case I believe it's fair to disregard all other epistemological processes and blame BSODs on the game out of principle
> In that case I believe it's fair to disregard all other epistemological processes and blame BSODs on the game out of principle
I am sorry but that is asinine and unscientific. You should blame BSODs on what is causing them. I don't like kernel anti-cheat but I will blame the actual cause of the issues, not assign blame on things which I don't approve of.
I am a long time Linux user, and many of the people complaining about BSODs on Windows had broken the OS in one way or another. Some were running weird stuff like 3rd-party shell extensions that modify core DLLs, or they had installed every POS shovelware/shareware crap. That isn't Microsoft's fault if you start running an unsupported configuration of the OS.
Similarly, the YouTubers that were most vocal about Helldivers problems did basically no proper investigation other than saying "look, it crashed", when it was quite clearly their broken hardware that was the issue. As previously stated, their CPU had a burn mark on one of the pins; some AM5 boards had faults that caused this IIRC. So everything indicated hardware failure being the cause of the BSOD. They still blamed the game, probably because it got them more watch time.
During the same time period when people were complaining about BSODs, I didn't experience one. I was running the same build of the game as them, playing on the same difficulty, and sometimes recording it via OBS (just like they were). What I didn't have was an AM5 motherboard; I have an older AM4 motherboard which doesn't have these problems.
> > - Be able to actually finish a mission (game still crashes a lot just after extraction, it's still rare for the full team to survive 3 missions in a row)
> False. A few months ago I played it for an entire day and the game was fine. Last week I played it a good portion of Saturday night. I'm in several large HellDivers focused Discord servers and I've not heard a lot of people complaining about it. Maybe 6 months ago or a year ago this was the case, but not now.
I specifically mean that exact moment, right after the Pelican starts to fly. I keep seeing "<player> left" or "disconnected". Some come back, and I have a habit of asking "Crash?"; they respond with "yeah".
> - Their PC to not reboot and BSOD (this was a case a few months ago)
I was just about to replace my GPU (a 4090 at that!); I was getting them 3 times a session. I sank a lot of hours into debugging it (replaced cables, switched PSUs between desktops) and just gave up. After a few weeks, lo and behold, a patch came out and it all disappeared.
A lot of people just repeat hearsay about the game
It's basically an Internet fable at this point that there's "a game that physically damages your hardware".
The answer to every such claim is just: no. But it's click bait gold to the brain damage outrage YouTuber brigade.
Accidentally using a ton of resources might reveal weaknesses, but it is absolutely not any software vendor's problem that 100% load might reveal that your thermal paste application sucked or that Nvidia is skimping on cable load balancing.
Trust me, I'm a software developer with more than two decades of experience. Have been dabbling in hardware since the Amiga 500 era. "I have that specific set of skills" that allows me to narrow down a class of issues pretty well - just a lot of component switching in a binary divide and conquer fashion across hardware.
The issue is 1) actually exaggerated in the community, but not without actual substance, and 2) getting disregarded exactly because of the exaggerations. It was a very real thing.
I also happen to have a multi-GPU workstation that works flawlessly, too.
This was pretty much my take as well. I have an older CPU, motherboard and GPU combo from before the newer GPU power cables that obviously weren't tested properly, and I have no problems with stability.
These guys are running an intensive game on the highest difficulty while streaming, and they probably have a bunch of browser windows and other software running in the background. Any weakness in the system is going to be revealed.
I had performance issues during that time and I had to restart game every 5 matches. But it takes like a minute to restart the game.
I love Helldivers 2, but from what I can tell it's a bunch of enthusiasts using a relatively broken engine to try to do cool stuff. It almost reminds me of the first Pokemon game. I'll bet there's all sorts of stuff they get wrong from a strictly technical standpoint. I love the game so much I see this more as a charming quirk than as something which really deserves criticism. The team never really expected their game to be as popular as it's become, and I think we're still inheriting flaws from the surprise interest in the game. (Some of this plays out in the tug of war between the dev team's hopes for a realistic grunt fantasy and the player base's horde power fantasy.)
The game is often broken but they’ve nailed the physics-ey feel so hard that it’s a defining feature of the game.
When an orbital precision strike reflects off the hull of a factory strider and kills your friend, or eagle one splatters a gunship, or you get ragdolled for like 150m down a huge hill and then a devastator kills you with an impassionate stomp.
Those moments elevate the game and make it so memorable and replayable. It feels like something whacky and new is around every corner. Playing on PS5 I’ve been blessed with hardly any game-breaking bugs or performance issues, but my PC friends have definitely been frustrated at times
All other games from the same studio have the same features.
In fact, the whole point of their games is that they are co-op games where it is easy to accidentally kill your allies in hilarious ways. It is the reason, for example, why you cast stratagems using complex key sequences; it is intentional, so that you can make mistakes and cast the wrong thing.
It's actually a really nice spell casting system. It lets you have a ton of different spells with only 4 buttons. It rewards memorizing the most useful (like reinforce). It gives a way for things like the squid disruptor fields or whatever they're called to mess with your muscle memory while still allowing spells. It would be way less interesting if it was just using spell slots like so many other games.
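As a toy illustration of why a handful of directional inputs goes such a long way (made-up codes, not the real stratagem table): with sequences of length 5 over 4 directions you already have 4^5 = 1024 possible codes, and matching one is just a dictionary lookup.

    # Hypothetical input codes, not the actual in-game sequences.
    STRATAGEMS = {
        ("up", "down", "right", "left", "up"): "Reinforce",
        ("up", "right", "down", "down", "down"): "Orbital Precision Strike",
        ("down", "up", "left", "down", "up"): "SOS Beacon",
    }

    def match(inputs: tuple[str, ...]) -> str | None:
        """Return the stratagem whose code the player just dialled, if any."""
        return STRATAGEMS.get(inputs)

    print(match(("up", "down", "right", "left", "up")))  # Reinforce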
The only wrong thing I've been throwing is the SOS Beacon instead of a Reinforce, and not just once, which is just annoying. It makes the game public if it was friends-only and gives it priority in the quick play queue. So that can't be it.
The dialing adds friction to tense situations, which is okay as a mechanic.
I think it has the best explosions in any game I've played too. They're so dang punchy. Combined with their atmospheric effects (fog and dust and whatnot) frantic firefights with bots look fantastic.
It's such a janky game. Definitely feels like it was built using the wrong tool for the job. Movement will get stuck on the most basic things. Climbing and moving over obstacles is always a yucky feeling.
A lot of people in the comments here don't seem to understand that it is a relatively small game company with an outdated engine. I am a lot more forgiving of smaller organisations when they make mistakes.
The game has semi-regular patches where they seem to fix some things and break others.
The game has a lot of hidden mechanics that aren't obvious from the tutorial, e.g. many weapons have different fire modes and fire rates, and stealth is an option in the game. The game has a decent community and people are friendly for the most part. It also has the "feature" of being playable for about 20-40 minutes at a time, so you can just put it down again for a bit and come back.
The bad tutorial at least has some narrative justification. It's just a filter for people who are already useful as shock troops with minimal training.
Not only does the bad tutorial have an in-universe justification; the ways in which it is bad are actually significant to the worldbuilding in multiple ways.
The missing information also encourages positive interactions among the community - newer players are expected to be missing lots of key information, so teaching them is a natural and encouraged element of gameplay.
I stopped playing the game a while ago, but the tutorial always struck me as really clever.
I also think that the tutorial would be tedious if it went through too much of the mechanics. They show you the basics, the rest you pick up through trial and error.
Considering it still costs $40 for a 2-year-old game, I think they are way beyond the excuse of a small team on a low budget trying to make cool stuff. They have received shit tons of money and are way too late in trying to optimise the game. When it came out it ran so pisspoor that I shelved it for a long time. Trying it recently, it's only marginally better. It's really poorly optimised, and blaming old tech is nonsense.
People make much smoother and more complex experiences in old engines.
You need to know your engine as a dev and not cross its limits at the cost of the user experience, and then blame your tools...
The whole story about more data making load times better is utter rubbish. It's a sign of pisspoor resource management and usage. For the game they have, they should have realized a 130GB install is unacceptable. It's not like they have very elaborate environments - a lot of similar textures and structures everywhere. It's not like it's some huge unique world like The Witcher or such games...
There is an astronomical amount of information available for free on how to optimise game engines: loads of books, articles, and courses.
How much money do you think they have made so far?
"Arrowhead Game Studios' revenue saw a massive surge due to Helldivers 2, reporting around $100 million in turnover and $76 million in profit for the year leading up to mid-2025, significantly increasing its valuation and attracting a 15.75% investment from Tencent"
$76 million in profit but they can't figure out how to optimise a game engine. Get out.
The most recent Battlefield released at $80. Arc Raiders released at $40 with a $20 deluxe edition upgrade. I think $40 for a game like Helldivers 2 is totally fair. It's a fun game, worth at least 4 to 8 hours of playtime.
It's a comment about cost-to-hourly-entertainment. eg: if in the general sense you're spending $5-$10 per hour of entertainment you're doing at least OK. I understand that a lot of books and video games can far exceed this, but it's just a general metric and a bit of a low bar to clear. (I have a LOT more hours into the game so from my perspective my $40 has paid quite well.)
You cast spells in a similar way to calling in stratagems in HD2.
The spell system was super neat. There are several different elements (fire, air, water, earth, electricity, ice, and maybe something else - it's been a while since I played). Each element can be used on its own or is combinable. Different combinations cast different spells. Fire + water makes steam, for instance. Ice + air is a focused blizzard, etc.
there’s hundreds to learn and that’s your main weapon in the game. There’s even a spell you can cast that will randomly kick someone you’re playing with out of the game.
It’s great fun with friends, but can be annoying to play sometimes. If you try it, go with kb/m. It supports controller, but is way more difficult to build the spells.
Water, Life, Arcane, Shield, Lightning, Cold, Fire, and Earth. [0] It's worth noting that, though you can combine most of the elements to form new spells (and with compounding effects, for example wetting or steaming an enemy enhances lightning damage), you cannot typically combine opposites like lightning/ground, which will instead cancel out. Killed myself many times trying to cast lightning spells while sopping wet.
In my experience, though, nobody used the element names—my friends and I just referred to them by their keybinds. QFASA, anyone?
This is the most Helldivers 2 part for me. Spells being intentionally tricky to execute, combined with accidental element interactions and "friendly fire."
Oh my, I loved that game! It's wild everyone's throwing shade at Helldivers whilst ignoring that it was a massive success because of how fun it is. I've said it before: devs are really bad at understanding the art of making fun experiences.
This would make sense if it was a studio without experience, and without any external help, but their publisher is Sony Interactive Entertainment, which also provides development help when needed, especially optimizations and especially for PS hardware. SIE seems to have been deeply involved with Helldivers 2, doubling the budget and doubling the total development time. Obviously it was a good choice by SIE, it paid off, and of course there is always 100s of more important tasks to do before launching a game, but your comment reads like these sort of problems were to be expected because the team started out small and inexperienced or something.
>but your comment reads like these sort of problems were to be expected because the team started out small and inexperienced or something.
More or less nothing is optimized these days, and game prices and budgets have gone through the roof. Compared to the other games available these days (combined with how fun the game is) I definitely give HD2 a big pass on a lot of stuff. I'm honestly skeptical of Sony's involvement being a benefit, but that's mostly due to my experience regarding their attempts to stuff a PSN ID requirement into HD2 as well as their general handling of their IPs. (Horizon Zero Dawn is not only terrible, but they seem to try to force interest with a new remake on a monthly basis.)
Not true, lots of games are optimized, but it's one of those tasks that almost no one notices when you do it great, but everyone notices when it's missing, so it's really hard to tell by just consuming ("playing") games.
> I'm honestly skeptical of Sony's involvement being a benefit
I'm not, SIE have amazing engineers, probably the best in the industry, and if you have access to those kind of resources, you use it. Meanwhile, I agree that executives at Sony sometimes have no clue, but that doesn't mean SIE helping you with development suddenly has a negative impact on you.
BF6 comes to mind, out of newly released games. Arc Raiders too, seems to have avoided the heap of criticism because of performance, meaning it is probably optimized enough so people don't notice issues. Dyson Sphere Program (yet to be released) is a bit older, and indie, but very well optimized.
Thanks for the list -- now that you mention it, I recall being quite surprised to learn that Arc Raiders was not only an UE5 game but would also run nicely on my PC. (I haven't played it, but a friend asked me to consider it) Now that you mention it as well, I think I recall the BF6 folks talking specifically about not cramming too many graphical techniques into their games so that people could actually play the game.
> I recall being quite surprised to learn that Arc Raiders was not only an UE5 game but would also run nicely on my PC
Yeah, Unreal Engine (5 almost specifically) is another example of things that are unoptimized by default, very easy to notice, but once you work on it, it becomes invisible and it's not that people suddenly cheer, you just don't hear complaints about it.
It's also one of those platforms where there is a ton of help available from Epic if you really want it, so you can tune the defaults BEFORE you launch your game, but hardly anyone seemingly does that, and then both developers and users blame the engine, instead of blaming the people releasing the game. It's a weird affair all around :)
No Man's Sky didn't have technical issues at launch though; it ran fine for what it was. The problem with NMS was that people were told it would be a completely different experience compared to what it ended up being (at launch).
The game logic is also weird. It seems like they started with an attempt at a realistic combat simulator, which then had lots of unrealistic mechanics added on top in an attempt to wrangle it into an enjoyable game.
As an example for overly realistic physics, projectile damage is affected by projectile velocity, which is affected by weapon velocity. IIRC, at some point whether you were able to destroy some target in two shots of a Quasar Cannon or three shots depended on if you were walking backwards while you were firing, or not.
> depended on if you were walking backwards while you were firing
That sounds like a bug, not an intentional game design choice about the game logic, and definitely unrelated to realism vs not realism. Having either of those as goals would lead to "yeah, bullet velocity goes up when you go backwards" being an intentional mechanic.
To be clear, walking backwards (away from the target) reduced your bullet velocity relative to the target, reducing the damage you were doing and leading to you needing more shots.
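A hypothetical sketch of the kind of interaction being described (illustrative only, not Arrowhead's actual damage code): if the projectile inherits the shooter's velocity and damage scales with impact speed, backpedalling shaves off a little damage, which can push a target across a shots-to-kill breakpoint.

    import math

    def shots_to_kill(target_hp: float, base_damage: float,
                      muzzle_velocity: float, shooter_velocity: float) -> int:
        """Toy model: damage scales linearly with the projectile's launch speed."""
        impact_velocity = muzzle_velocity + shooter_velocity  # smaller when backpedalling
        damage = base_damage * (impact_velocity / muzzle_velocity)
        return math.ceil(target_hp / damage)

    # Illustrative numbers only.
    print(shots_to_kill(1899, 950, 800, 0.0))   # standing still    -> 2 shots
    print(shots_to_kill(1899, 950, 800, -5.0))  # walking backwards -> 3 shots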
You hit the nail on the head with the first Pokémon, but Helldivers 2 is an order of magnitude smaller in the amateur-to-success ratio.
Game Freak could not finish the project, so they had to be bailed out by Nintendo with an easy-to-program game so the company could get some much-needed cash (the Yoshi puzzle game on NES). Then years later, with no end to the game in sight, Game Freak had to stoop to contracting Creatures Inc. to finish the game. Since they had no cash, Creatures Inc. was paid with a portion of the Pokémon franchise.
Pokémon was a shit show of epic proportions. If it had been an SNES game it would have been canceled and Game Freak would have closed. The low development cost of Game Boy and the long life of the console made Pokémon possible.
My takeaway is that it seems like they did NO benchmarking of their own before choosing to do all that duplication. They only talk about performance tradeoff now that they are removing it. Wild
It's a valid issue; those of us who worked back in the day on GD/DVD etc. games really ran into bad loading walls if we didn't duplicate data for straight streaming.
Data sizes have continued to grow and HDD seek times haven't gotten better due to physics (even if streaming throughput has probably kept up), so the assumption isn't too bad considering history.
It's good that they actually revisited it _when they had time_, because launching a game, especially a multiplayer one, will run into a lot of breaking bugs, and this (while a big one, pun intended) is still by most classifications a lower-priority issue.
I've been involved in decisions like this that seem stupid and obvious. There are a million different things that could/should be fixed, and unless you're monitoring this proactively you're unlikely to know it should be changed.
I'm not an arrowhead employee, but my guess is at some point in the past, they benchmarked it, got a result, and went with it. And that's about all there is to it.
Performance profiling should be built into the engine and turned on at all times. Then this telemetry could be streamed into a system that tracks it across all builds, down to a specific scene. It should be possible to click a link on the telemetry server and start the game at that exact point.
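A sketch of what that could look like (hypothetical endpoint and field names, not any particular engine's API): a lightweight timer that tags each measurement with build and scene identifiers and ships it off, without ever being able to break the game.

    import json, time, urllib.request
    from contextlib import contextmanager

    TELEMETRY_URL = "https://telemetry.example.com/ingest"  # hypothetical endpoint

    @contextmanager
    def timed(metric: str, build: str, scene: str):
        """Time a block of engine work and report it with enough context to reproduce it."""
        start = time.perf_counter()
        try:
            yield
        finally:
            sample = {
                "metric": metric,
                "ms": (time.perf_counter() - start) * 1000,
                "build": build,
                "scene": scene,  # enough to deep-link back to the exact spot in the game
            }
            req = urllib.request.Request(TELEMETRY_URL, data=json.dumps(sample).encode(),
                                         headers={"Content-Type": "application/json"})
            try:
                urllib.request.urlopen(req, timeout=1)
            except OSError:
                pass             # never let telemetry take the game down

    # Usage, e.g. around level generation (example identifiers):
    # with timed("level_load", build="some-build-id", scene="some-mission/extraction"):
    #     generate_level(...)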
They admitted to testing nothing, they just [googled it].
To be fair, the massive install size was probably the least of the problems with the game; its performance has been atrocious, and when they released for Xbox, the update that came with it broke the game entirely for me, and it was unplayable for a few weeks until they released another update.
In their defense, they seem to have been listening to players and have been slowly but steadily improving things.
Playing Helldivers 2 is a social thing for me where I get together online with some close friends and family a few times a month and we play some helldivers and have a chat, aside from that period where I couldn't play because it was broken, it's been a pretty good experience playing it on Linux; even better since I switched from nvidia to AMD just over a week ago.
I'm glad they reduced the install size and saved me ~130GB, and I only had to download about another 20GB to do it.
>These loading time projections were based on industry data - comparing the loading times between SSD and HDD users where data duplication was and was not used. In the worst cases, a 5x difference was reported between instances that used duplication and those that did not. We were being very conservative and doubled that projection again to account for unknown unknowns.
>We now know that, contrary to most games, the majority of the loading time in HELLDIVERS 2 is due to level-generation rather than asset loading. This level generation happens in parallel with loading assets from the disk and so is the main determining factor of the loading time. We now know that this is true even for users with mechanical HDDs.
they did absolutely zero benchmarking beforehand, just went with industry hearsay, and decided to double it just in case.
Nowhere in that does it say “we did zero benchmarking and just went with hearsay”. Basing things on industry data is solid - looking at the Steam hardware surveys is a good way to figure out the variety of hardware used without commissioning your own reports. Tech choices are no different.
Do you benchmark every single decision you make on every system on every project you work on? Do you check that a Redis operation is actually O(1), or do you rely on hearsay? Do you benchmark every single SQL query, every DTO, the overhead of the DI framework, connection pooler, JSON serializer, log formatter? Do you ever rely on your own knowledge without verifying the assumptions? Of course you do - you’re human, and we have to make some baseline assumptions, and sometimes they’re wrong.
They made a decision based on existing data. This isn't as unreasonable as you are pretending, especially as PC hardware can be quite diverse.
You will be surprised what some people are playing games on; e.g. I know people that still use Windows 7 on an AMD Bulldozer rig. Atypical for sure, but not unheard of.
My PC now is 6 years old and I have no intention of upgrading it soon. My laptop is like 8 years old and it is fine for what I use it for. My monitors are like 10-12 years old (they are early 4k monitors) and they are still good enough. I am primarily using Linux now and the machine will probably last me to 2030 if not longer.
Pretending that this was an outrageous decision ignores that the data and the commonly assumed wisdom said there were still a lot of people using HDDs.
They've since rectified this particular issue, and yet there seems to be more criticism of the company after fixing it.
It was a real issue in the past with hard drives and small media assets. It's still a real issue even with SSDs. HDD/SSD IOPS are still way slower than contiguous reads when you're dealing with a massive amount of files.
At the end of the day it requires testing, which requires time, at a point when you don't have a lot of time.
This is not a good invocation of Chesterton's Fence.
The Fence is a parable about understanding something that already exists before asking to remove it. If you cannot explain why it exists, you shouldn't ask to remove it.
In this case, it wasn't something that already existed in their game. It was something that they read, then followed (without truly understanding whether it applied to their game), and upon re-testing some time later, realized it wasn't needed and caused detrimental side-effects. So it's not Chesterton's Fence.
You could argue they followed a videogame industry practice to make a new product, which is reasonable. They just didn't question or test their assumptions that they were within the parameters of said industry practice.
I don't think it's a terrible sin, mind you. We all take shortcuts sometimes.
It's not an issue with asynchronous filesystem IO. Again, async file IO should be the default for game engines. It doesn't take a genius to gather a list of assets to load and then wait for the whole list to finish rather than blocking on every tiny file.
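A minimal sketch of that batched approach (illustrative Python; a real engine would use its own IO scheduler or platform APIs such as io_uring or DirectStorage):

    import asyncio
    from pathlib import Path

    async def load_asset(path: Path) -> bytes:
        # Plain file reads block, so push each one to a worker thread; the point
        # is that all requests are in flight together instead of one at a time.
        return await asyncio.to_thread(path.read_bytes)

    async def load_level(asset_paths: list[Path]) -> dict[Path, bytes]:
        blobs = await asyncio.gather(*(load_asset(p) for p in asset_paths))
        return dict(zip(asset_paths, blobs))

    # assets = asyncio.run(load_level([Path("rock.dds"), Path("tree.dds")]))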
There are two different things when talking about application behavior versus disk behavior.
>wait for the whole list to finish rather than blocking on every tiny file.
And this is the point. I can make a test that shows exactly what's going on here. Make a random file generator that generates 100,000 4k files. Now write them to a hard drive with other data and things going on at the same time. Then, in another run of the program, have it generate 100,000 4k files and put them in a zip.
Now, read the set of 100k files from disk and at the same time read the 100k files in a zip....
One finishes in less than a second and one takes anywhere from a few seconds to a few minutes depending on your disk speeds.
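A rough version of that experiment, for anyone who wants to try it (results depend heavily on the drive and on the OS page cache; for honest numbers, drop caches or reboot between the write and read phases):

    import os, time, zipfile
    from pathlib import Path

    N, SIZE = 100_000, 4096
    work = Path("io_test")
    work.mkdir(exist_ok=True)

    # Write N small files, and the same data as a single stored (uncompressed) archive.
    with zipfile.ZipFile(work / "bundle.zip", "w", zipfile.ZIP_STORED) as z:
        for i in range(N):
            blob = os.urandom(SIZE)
            (work / f"f{i:06}.bin").write_bytes(blob)
            z.writestr(f"f{i:06}.bin", blob)

    t0 = time.perf_counter()
    for i in range(N):
        (work / f"f{i:06}.bin").read_bytes()
    t1 = time.perf_counter()
    with zipfile.ZipFile(work / "bundle.zip") as z:
        for name in z.namelist():
            z.read(name)
    t2 = time.perf_counter()

    print(f"100k small files: {t1 - t0:.2f}s   single archive: {t2 - t1:.2f}s")

On a hard drive with a cold cache the gap is dramatic; on an SSD it shrinks a lot, but per-file open/seek overhead still shows up.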
"Industry hearsay" in this case was probably Sony telling game devs how awesome the PS5's custom SSD was gonna be, and nobody bothered to check their claims.
HD2 started as a PlayStation exclusive and was retargeted mid-development for simultaneous release.
So the PS5's SSD architecture was what developers were familiar with when they tried to figure out what changes would be needed to make the game work on PC.
If what they were familiar with was a good SSD, then they didn't need to do anything. I don't see how anything Sony said about their SSD would have affected things.
Maybe you're saying the hearsay was Sony exaggerating how bad hard drives are? But they didn't really do that, and the devs would already have experience with hard drives.
I don't really understand your point. You're making a very definitive statement about how the PS5's SSD architecture is responsible for this issue - when the issue is on a totally different platform, where they have _already_ attempted (poorly, granted) to handle the different architectures.
> our worst case projections did not come to pass. These loading time projections were based on industry data - comparing the loading times between SSD and HDD users where data duplication was and was not used. In the worst cases, a 5x difference was reported between instances that used duplication and those that did not. We were being very conservative and doubled that projection again to account for unknown unknowns.
> The pop-culture cargo cult description, however, takes features of some cargo cults (the occasional runway) and combines this with movie scenes to yield an inaccurate and fictionalized description. It may be hard to believe that the description of cargo cults that you see on the internet is mostly wrong, but in the remainder of this article, I will explain this in detail.
On the flip side, I don't remember who did it, but basically extracting textures on disk fixed all the performance issues UE5 has in some benchmarks (sorry for being vague, but I can't find the source material right now). But their assumption is in fact a sound one.
Yes. It's quite common for games to have mods that repack textures or significantly tweak the UE5 config at the moment - and it's very common to see users using them when it doesn't actually affect their use cases.
As an aside, I do enjoy the modding community naming over multiple iterations of mods - "better loading" -> "better better loading" -> "best loading" -> "simplified loading" -> "x's simplified loading" -> "y's simplified loading" -> "z's better simplified loading". Where 'better' is often some undisclosed metric based on some untested assumptions.
It's pretty standard to do that duplication for games on CD/DVD because seek times are so long. It probably just got carried over as the "obviously correct" way of doing things, since HDDs are like DVDs if you squint a bit
You can't bench your finished game before it exists and you don't really want to rock the boat late in dev, either.
It was a fundamentally sound default that they revisited. Then they blogged about the relatively surprising difference it happened to make in their particular game. As it turns out, the loading is CPU-bound anyway, so while the setting is doing its job, in the context of the final game it happens to not be the bottleneck.
There's also the movement away from HDD and disc drives in the player base to make that the case as well.
It's very easy to accidentally get misleading benchmarking results in 100 different ways, I wouldn't assume they did no benchmarking when they did the duplication.
The good old "studios don't play their own games" strikes again :P
Games would be much better if all people making them were forced to spend a few days each month playing the game on middle-of-the-road hardware. That will quickly teach them the value of fixing stuff like this and optimising the game in general.
I've worked in games for close to 15 years, and at every studio I've worked at we've played the game very regularly. On my current team, every person plays the game at least once a week, and more often as we get closer to builds.
In my last project, the gameplay team played every single day.
> Games would be much better if all people making them were forced to spend a few days each month playing the game on middle-of-the-road hardware
How would playing on middle of the road hardware have caught this? The fix to this was to benchmark the load time on the absolute bottom end of hardware, with and without the duplicated logic. Which you'd only do once you have a suspicion that it's going to be faster if you change it...
They could have been lying I guess but I listened to a great podcast about the development of Helldivers 2 (I think it was gamemakers notebook) and one thing that was constantly brought up was as they iterated they forced a huge chunk of the team to sit down and play it. That’s how things like diving from a little bit too high ended up with you faceplanting and rag-dolling, tripping when jet packing over a boulder that you get a little too close to, etc. They found that making it comically realistic in some areas led to more unexpected/emergent gameplay that was way more entertaining. Turrets and such not caring if you’re in the line of fire was brought up I believe.
That’s how we wound up with this game where your friends are as much of a liability as your enemies.
They used industry data to make the decision first, to avoid potential multi-minute load times for 10% or so of their players - it's hard to test all kinds of PC configurations. Now they have telemetry showing that it doesn't matter, because another parallel task takes about as much time anyway.
Maybe it's changed a lot statistically in the last few years, but for a long time price-conscious PC gamers had the mantra of a small SSD for the OS and a large HDD for games, so I could see that being assumed to be much more normal during development.
I was curious whether they optimized the download. Did it download the 'optimized' ~150 GB, wasting a lot of time there, or did it download the ~20 GB of unique data and duplicate it as part of the installation?
I still don't know, but instead I found an interesting Reddit post where users found and analyzed this "waste of space" three months ago.
WebDevs who have build systems that take ten minutes and download tens of megabytes of JS and have hundreds of milliseconds of lag are sooooooooooooo not allowed to complain about game devs ever.
Oh, at first I thought you were talking about websites doing that and I was going to say "sure, those people can't complain, but the rest of us can".
Then I realized you said build systems and eh, whatever. It's not good for build systems to be bloated, but it matters a lot less than the end product being bloated.
And you seem to be complaining about the people that are dealing with these build systems themselves, not inflicting them on other people? Why don't they get to complain?
Download bloat is net less impactful than build time bloat imho. Game download and install size bloat is bad, but it's mostly a one-time cost. Build time bloat doesn't directly impact users, but iteration time is GodKing, so bad build times indirectly hurt consumers.
But that's all beside the point. What I was really doing was criticizing the <waves hands wildly> HN commenters. HN posters are mostly webdevs because most modern programmers are webdevs. And while I won't say the file bloat here wasn't silly, I won't stand for game dev slander from devs that commit faaaaaaaaaaaaaar greater sins.
This isn't unique to games, and it's not just "today". Go back a decade [0] and you'll find people making similar observations about one of the largest tech companies on the planet.
> That being said, cartridges were fast. The move away from cartridges was a wrong turn
Cartridges were also crazy expensive. A N64 cartridge cost about $30 to manufacture with a capacity of 8MB, whereas a PS1 CD-ROM was closer to a $1 manufacturing cost, with a capacity of 700MB. That's $3.75/MB versus $0.0014/MB - over 2600x more expensive!
Without optical media most games from the late 90s & 2000s would've been impossible to make - especially once it got to the DVD era.
I hate it when you buy a physical game, insert the disk, and immediately have to download the game in order to play the game because the disk only contains a launcher and a key. Insanity of the worst kind.
Nintendo is pretty good about putting a solid 1.0 version of their games on the cartridges at release. But on the other hand, the Switch cartridges use NAND memory, which means that if you aren't popping them into a system to refresh the charge every once in a while, your physical cartridge might not last as long as the servers they keep online so you can re-download a digital purchase.
I've kinda given up on physical games at this point. I held on for a long time, but the experience is just so bad now. They use the cheapest, flimsiest, most fragile plastic in the cases. You don't get a nice instruction manual anymore. And honestly, keeping a micro SD card in your system that can hold a handful of games is more convenient than having to haul around a bunch of cartridges that can be lost.
I take solace in knowing that if I do still have a working Switch in 20 years and lose access to games I bought a long time ago, hopefully the hackers/pirates will have a method for me to play them again.
> the Switch cartridges use NAND memory which means if you aren't popping them into a system to refresh the charge every once in a while, your physical cartridge might not last as long
You've been paying attention to the wrong sources for information about NAND flash. A new Switch cartridge will have many years of reliable data retention, even just sitting on a shelf. Data retention only starts to become a concern for SSDs that have used up most of their write endurance; a Switch cartridge is mostly treated as ROM and only written to once.
The read speed off of an 8x DVD is ~10 MB/s. The cheapest 500 GB SSD on Amazon has a read speed of 500 MB/s. An NVMe drive reads at 2,500 MB/s. We can read an entire DVD's capacity (4.7 GB) from an SSD in under 10 seconds, compared to 8 minutes.
They do, but it's irrelevant to performance nowadays since you're required to install all of the disc data to the SSD before you can play. The PS3/360 generation was the last time you could play games directly from a disc (and even then some games had an install process).
I'm glad they've been able to do this, looks like a huge improvement for HD2 on PC.
I've been on PS5 since launch and aside from Baldur's Gate 3, it's been the best game this gen IMO.
The negativity I see towards the game (especially on YouTube) is weird. Some of the critiques seem legit, but a lot of it feels like rage bait, which appears to describe a lot of YT videos around gaming lately.
Anyway, a big improvement for a great game. Seems like less of an incentive now to uninstall if you only play now and then.
Yes, but those are rarely a thing for most live service games. Unless someone is working on a reimplementation of the entire server side, there's no point in offering or downloading pirate copies.
If this article was exciting for you, I also highly recommend this one. A random dude fixed a bug in GTA 5 that was the root cause of it loading insanely slowly since the game came out!
Pretty cool. I think it’s completely normal to be under a crunch and just go with some standard practices under normal conditions. Cool that they went back and sorted it out afterwards!
I’ve got to say. I do find it somewhat unusual that despite the fact that every HN engineer has John Carmack level focus on craftsmanship, about 1/100k here produce that kind of outcome.
I don’t get it. All of you guys are good at pointing out how to do good engineering. Why don’t you make good things?
I don't think this is the real explanation. If they gave the filesystem a list of files to fetch in parallel (async file IO), the concept of "seek time" would become almost meaningless. This optimization will make fetching from both HDDs and SSDs faster. They would be going out of their way to make their product worse for no reason.
Solid state drives tend to respond well to parallel reads, so it's not so clear. If you're reading one at a time, sequential access is going to be better though.
But for a mechanical drive, you'll get much better throughput on sequential reads than random reads, even with command queuing. I think earlier discussion showed it wasn't very effective in this case, and taking 6x the space for a marginal benefit for the small % of users with mechanical drives isn't worthwhile...
Every storage medium, including RAM, benefits from sequential access. But it doesn't have to be super long sequential access; the seek time (or block-open time) just needs to be short relative to the time spent reading the next block.
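To put rough numbers on that (a sketch, not a measurement - the 10 ms seek and 200 MB/s sequential figures are assumptions):

    # Effective throughput when every read pays one seek, then a sequential burst.
    # 10 ms per seek and 200 MB/s sequential are assumed HDD-ish numbers.
    def effective_mb_per_s(block_mb, seek_ms=10.0, seq_mb_per_s=200.0):
        total_s = seek_ms / 1000 + block_mb / seq_mb_per_s
        return block_mb / total_s

    for block_mb in (0.004, 0.064, 1, 16, 256):  # 4 KB up to 256 MB per read
        print(f"{block_mb:>8} MB reads -> {effective_mb_per_s(block_mb):7.1f} MB/s effective")

With those assumed numbers, 4 KB reads crawl along at well under 1 MB/s, while reads of a few tens of MB already get most of the drive's sequential speed - which is the point about the block only needing to dwarf the seek.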
If they fill your hard drive you're less likely to install other games. If you see a huge install size you're less likely to uninstall with plans to reinstall later, because that'd take a long time.
>If they gave the filesystem a list of files to fetch in parallel (async file IO)
This does not work if you're doing tons of small IO and you want something fast.
Let's say we're on an HDD with 200 IOPS and we need to read 3000 small files scattered randomly across the hard drive.
Well, at minimum this is going to take 15 seconds, plus any additional seek time.
Now, let's say we zip up those files into a solid archive. You'll read it in half a second. The problem comes in when different levels all need a different 3000 files. Then you end up duplicating a bunch of stuff.
Now, where this typically falls apart for modern game assets is that they are getting very large, which tends to make seek time matter much less.
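Rough sketch of that estimate, with made-up but plausible numbers (200 IOPS, 150 MB/s sequential, and a 64 KB average asset size are all assumptions):

    # 3000 random small reads vs one contiguous read of the same data.
    def load_estimate_s(n_files=3000, avg_file_kb=64, iops=200, seq_mb_per_s=150):
        random_s = n_files / iops                                # roughly one seek per file
        sequential_s = (n_files * avg_file_kb / 1024) / seq_mb_per_s
        return random_s, sequential_s

    print(load_estimate_s())  # -> (15.0, ~1.25) seconds

Same data, ~15 seconds scattered vs on the order of a second packed, which is the whole argument for solid archives on spinning disks.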
I haven't found any asynchronous IOPS numbers for HDDs anywhere. The IOPS figures on the internet are just 1000 ms divided by the seek time, with an 8 ms seek time for moving from the outer to the inner track, which is only really relevant for the synchronous file IO case.
For asynchronous IO you can just do inward/outward passes to amortize the seek time over multiple files.
While it may not have been obvious, I have taken archiving or bundling of assets into a bigger file for granted. The obvious benefit is that the filesystem will tend to store the game files contiguously. This has nothing to do with file duplication though, and is a somewhat separate topic, because it costs nothing and only has benefits.
The asynchronous file IO case for bundled files is even better, since you can just hand over the internal file offsets to the async file IO operations and get all the relevant data in parallel so your only constraint is deciding on an optimal lower bound for the block size, which is high for HDDs and low for SSDs.
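A rough sketch of that pattern (the (offset, size) index is hypothetical - it stands in for whatever table of contents the bundle format provides - and os.pread is POSIX-only):

    import os
    from concurrent.futures import ThreadPoolExecutor

    def read_bundle_assets(bundle_path, index):
        # `index` is a hypothetical list of (offset, size) pairs for the assets a level needs.
        # Each pread is independent, so the OS and the drive can reorder and merge the requests.
        fd = os.open(bundle_path, os.O_RDONLY)
        try:
            with ThreadPoolExecutor(max_workers=16) as pool:
                futures = [pool.submit(os.pread, fd, size, offset) for offset, size in index]
                return [f.result() for f in futures]
        finally:
            os.close(fd)

On an SSD the deep queue is where the win comes from; on an HDD you'd want the index sorted by offset so the passes stay mostly one-directional.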
>I haven't found any asynchronous IOPS numbers on HDDS anywhere. The internet IOPs are just 1000ms/seek time with a 8ms seek time for moving from the outer to the inner track, which is only really relevant for the synchronous file IO case.
>For asynchronous IO you can just do inward/outward passes to amortize the seek time over multiple files.
Here's a random blog post that has benchmarks for a 2015 HDD:
It shows 1.5MB/s for random 4K performance with high queue depth, which works out to just under 400 IOPS. 1 queue depth (so synchronous) performance is around a third.
>I haven't found any asynchronous IOPS numbers on HDDS anywhere.
As the other user stated, just look up CrystalDiskMark results for both HDDs and SSDs and you'll see hard drives do about a third of a MB/s on random 4K file IO, while the same hard drive will do a couple hundred MB/s on a contiguous read. For things like this, reading a zip and decompressing in memory is "typically" (again, you have to test this) orders of magnitude faster.
No, not at all. But by putting every asset a level (for example) needs in the same file, you can pretty much guarantee you can read it all sequentially without additional seeks.
That does force you to duplicate some assets a lot. It's also more important the slower your seeks are. This technique is perfect for disc media, since it has a fixed physical size (so wasting space on it is irrelevant) and slow seeks.
> by putting every asset a level (for example) needs in the same file, you can pretty much guarantee you can read it all sequentially
I'd love to see it analysed. Specifically, the average number of non-sequential jumps vs the overall size of the level. I'm sure you could avoid jumps within megabytes. But if someone ever got close to filling up the disk in the past, the chances of getting contiguous gigabytes are much lower. This paper effectively says that if you have long files, there are almost guaranteed to be gaps https://dfrws.org/wp-content/uploads/2021/01/2021_APAC_paper... so at that point, you may be better off preallocating the individual files and eating the cost of switching between them.
From that paper, table 4, large files had an average # of fragments around 100, but a median of 4 fragments. A handful of fragments for a 1 GB level file is probably a lot less seeking than reading 1 GB of data out of a 20 GB aggregated asset database.
But it also depends on how the assets are organized, you can probably group the level specific assets into a sequential section, and maybe shared assets could be somewhat grouped so related assets are sequential.
Sure. I’ve seen people that do packaging for games measure various techniques for hard disks typical of the time, maybe a decade ago. It was definitely worth it then to duplicate some assets to avoid seeks.
Nowadays? No. Even those with hard disks will have lots more RAM and thus disk cache. And you are even guaranteed SSDs on consoles. I think in general no one tries this technique anymore.
It's an optimistic optimization so it doesn't really matter if the large blobs get broken up. The idea is that it's still better than 100k small files.
Not 'full' defragmentation - Microsoft labs did a study and found that beyond 64 MB slabs of contiguous file data you don't gain much, so they don't care about getting gigabytes fully defragmented.
Not really. But when you write a large file at once (like with an installer), you'll tend to get a good amount of sequential allocation (unless your free space is highly fragmented). If you load that large file sequentially, you benefit from drive read ahead and OS read ahead --- when the file is fragmented, the OS will issue speculative reads for the next fragment automatically and hide some of the latency.
If you break it up into smaller files, those are likely to be allocated all over the disk; plus you'll have delays on reading because windows defender makes opening files slow. If you have a single large file that contains all resources, even if that file is mostly sequential, there will be sections that you don't need, and read ahead cache may work against you, as it will tend to read things you don't need.
Their concern was that one person in a squad loading on HDD could slow down the level loading for all players in a squad, even if they used a SSD, so they used a very normal and time-tested optimisation technique to prevent that.
I assume asset reads nowadays are much heavier than 4 kB though, especially if assets meant to be loaded together are bundled together in one file. So games now should be spending less time seeking relative to their total read size. Combined with HDD caches and parallel reads, this practice of duplicating over 100 GB across bundles is most likely a cargo cult by now.
Which makes me think: Has there been any advances in disk scheduling in the last decade?
The idea is to duplicate assets so loading a "level" is just sequential reading from the file system. It's required on optical media and can be very useful on spinning disks too. On SSDs it's insane. The logic should've been the other way around: do a speed test on start and offer to "optimise for spinning media" if the performance metrics look like it would help.
If the game was ~20GB instead of ~150GB almost no player with the required CPU+GPU+RAM combination would be forced to put it on a HDD instead of a SSD.
This idea of one continuous block per level dates back to the PS1 days.
Hard drives are much, much faster than optical media - on the order of 80 seeks per second and 300 MB/s sequential versus, like, 4 seeks per second and 60 MB/s sequential (for DVD-ROM).
You still want to load sequential blocks as much as possible, but you can afford to have a few. (Assuming a traditional engine design, no megatextures etc) you probably want to load each texture from a separate file, but you can certainly afford to load a block of grass textures, a block of snow textures, etc. Also throughput is 1000x higher than a PS1 (300 kB/s) so you can presumably afford to skip parts of your sequential runs.
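Plugging those rough figures into a toy model shows why a handful of extra seeks is cheap on a hard drive but brutal on a DVD (the numbers are the approximate ones above, not measurements):

    # n_seeks seeks plus one sequential pass over the level data.
    def load_time_s(total_mb, n_seeks, seeks_per_s, seq_mb_per_s):
        return n_seeks / seeks_per_s + total_mb / seq_mb_per_s

    level_mb = 1024  # hypothetical 1 GB of level data
    for name, seeks_per_s, seq_mb_per_s in (("HDD", 80, 300), ("DVD", 4, 60)):
        for n_seeks in (1, 50):
            t = load_time_s(level_mb, n_seeks, seeks_per_s, seq_mb_per_s)
            print(f"{name}, {n_seeks:>2} seeks: {t:5.1f} s")

Fifty seeks add well under a second on the HDD but over ten seconds on the DVD, so splitting a level's data into a handful of grouped files is fine on one and painful on the other.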
In gaming terms, you get at most one or two disk reads per frame, which effectively means everything has to be carefully prefetched well in advance of being needed. Whereas on a decade-old SATA SSD you get at least dozens of random reads per frame.
https://partner.steamgames.com/doc/sdk/uploading#AppStructur...
But over 6x the size with so little benefit for such a small segment of the players is very frustrating. Why wasn't this caught earlier? Why didn't anyone test? Why didn't anyone weigh the pros and cons?
It's kind of exemplary of HD2's technical state in general - which is a mix of poor performance and bugs. There was a period where almost every other mission became impossible to complete because it was bugged.
The negativity is frustration boiling over from years of a bad technical state for the game.
I do appreciate them making the right choice now though, of course.
Helldivers II was also much smaller at launch than it is now. It was almost certainly a good choice at launch.
Then they just didn't reconsider the choice until, well, now.
Have you never worked in an organization that made software?
Damn near everything could be 10x as fast and use 1/10th the resources if someone bothered to take the time to find the optimizations. RARE is it that something is even in the same order of magnitude as its optimum implementation.
Reverting it now though, when the game is out there on a million systems, requires significant investigation to ensure they're not making things significantly worse for anyone, plus a lot of testing to make sure it doesn't outright break stuff.
Given that, people need to accept higher costs, longer development times, or reduced scope if they want better optimized games.
But what is worse, just trying to optimize software is not the same as successfully optimizing it. So time and money spent on optimization might yield no results, because there might not be any more efficiency to be gained, the person doing the work lacks the technical skill, the gains are part of a tradeoff that cannot be justified, or the person doing the work can't make the change (i.e., a 3rd party library is the problem).
The lack of technical skill is a big one, IMO. I'm personally terrible at optimizing code, but I'm pretty good at building functional software in a short amount of time. We have a person on our team who is really good at it and sometimes he'll come in after me to optimize work that I've done. But he'll spend several multiples of the time I took making it work and hammering out edge cases. Sometimes the savings is worth it.
God why can’t it just be longer development time. I’m sick of the premature fetuses of games.
The reason games are typically released as "fetuses" is because it reduces the financial risk. Much like any product, you want to get it to market as soon as is sensible in order to see if it's worth continuing to spend time and money on it.
Where do you stop? What do the 5 tech designers do while the 2 engine programmers optimise every last byte of network traffic?
> I’m sick of the premature fetuses of games.
Come on, keep this sort of crap off here. Games being janky isn't new - look at old console games and they're basically duct taped together. Go back to Half-life 1 in 1998 - the Xen world is complete and utter trash. Go back farther and you have stuff that's literally unplayable [0], or things that were so bad they literally destroyed an entire industry [1], or rendered the game uncompleteable [2].
[0] https://en.wikipedia.org/wiki/Dr._Jekyll_and_Mr._Hyde_(video... [1] https://www.theguardian.com/film/2015/jan/30/a-golden-shinin... [2] https://www.reddit.com/r/gamecollecting/comments/hv63ad/comm...
One of the highest rated games ever released without devs turning on the "make it faster" button which would have required approximately zero effort and had zero downsides.
This kind of stuff happens because the end result A vs. B doesn't make that much of a difference.
And it's very hard to have a culture of quality that doesn't get overrun by zealots who will bankrupt you while they squeeze the last 0.001% of performance out of your product before releasing. It is very hard to have a culture of quality that does the important things and doesn't do the unimportant ones.
The people who obsess with quality go bankrupt and the people who obsess with releasing make money. So that's what we get.
A very fine ability for evaluating quality mixed with pragmatic choice for what and when to spend time on it is rare.
I don't see why it's a surprise that people react "negatively", in the sense of being mad that (a) Helldivers 2 was intentionally screwing the customers before, and (b) everyone else is still doing it.
When the details of exactly why the game was so large came out, many people felt this was a sort of customer betrayal. The publisher was burning a large part of the volume of your precious high-speed SSD for a feature that added nothing to the game.
People probably feel the same about this, why were they so disrespectful of our space and bandwidth in the first place? But I agree it is very nice that they wrote up the details in this instance.
This doesn't even pass the sniff test. The files would just be compressed for distribution and decompressed on download. Pirated games are well known for having "custom" installers.
All Steam downloads are automatically compressed. It's also irrelevant. The point is that playback of uncompressed audio is indeed cheaper than playback of compressed audio.
Even when Titanfall 2 was released in 2016, I don't think that was meaningfully the case. Audio compression formats have been tuned heavily for efficient playback.
Software developers of all kinds (not just game publishers) have a long and rich history of treating their users' compute resources as expendable. "Oh, users can just get more memory, it's cheap!" "Oh, xxxGB is such a small hard drive these days, users can get a bigger one!" "Oh, most users have Pentiums by now, we can drop 486 support!" Over and over we've seen companies choose to throw their users under the bus so that they can cheap out on optimizing their product.
It seems no one takes pride in their piracy anymore.
The game programming was actually just as research focused and involved as the actual research. They were trying to figure out how to get the lowest latency and consistency for impact sounds.
also keep in mind that modern gaming generates more revenue than the movie industry, so it's in the interests of several different parties to denigrate or undermine any competing achievement -- "Bots Rule Every Thing Around Me"
I mean, I agree with you that it is trendy, and seemingly easy, to shit on other people's work, and at this point it seems to be a challenge people take up upon themselves to criticise something in the most flowery and graphic way possible, hoping to score those sweet internet points.
For maybe the last 6-7 years I have stopped reading reviews and opinions about newly launched games completely; the internet audience (and reviewers) are just so far off base compared to my own perspective and experience that it has become less than useless, it's just noise at this point.
Really, the different factions in software development are a fascinating topic to explore. Add embedded to the discussion, and you could probably start fights in ways that flat out don't make sense.
I won’t state my own personal views here, but for those that share the above perspective, there is little benefit of the doubt they’ll extend towards Arrowhead.
Game studios no longer care how big their games are if Steam will still take them. This is a huge problem. GTA5 was notorious for loading JSON again, and again, and again during loading, and it was just a mess. Same for HD2. Game engines have the ability to only pack what is used, but it's still up to the developers to make sure their assets are reusable so as to cut down on size.
This is why Star Citizen has been in development for 15 years. They couldn't optimize early and were building models and assets like it's for film. Not low poly game assets but super high poly film assets.
The anger here is real. The anger here is justified. I'm sick of having to download 100gb+ simply because a studio is too lazy and just packed up everything they made into a bundle.
Reminds me of the Crack.com interview with Jonathan Clark:
Adding to the difficulty of the task, our artist had no experience in the field. I remember in a particular level we wanted to have a dungeon. A certain artist began by creating a single brick, then duplicating it several thousand times and building a wall out of the bricks. He kept complaining that his machine was too slow when he tried to render it. Needless to say this is not the best way to model a brick wall.
https://web.archive.org/web/20160125143707/http://www.loonyg...
GTA5 had well over 1000 people on its team.
When making a game, once you have something playable, the next step is to figure out how to package it. This is included in that effort: determining which assets to compress, package, and ship. Sometimes this is done by the engine. Sometimes this is done by the art director.
When I did this, my small team took a whole sprint to make sure that assets were packed, that tilemaps were made, that audio files were present, and we did an audit to make sure nothing extra was packaged on disk. Today, because of digital stores and just releasing zip files, no one cares what they ship, and often you can see it if you investigate the files of any Unity or Unreal engine game. Just throw it all over the fence.
Many would consider this a bare minimum rather than something worthy of praise.
The Electron debate isn't about details purism, the Electron debate is about the foundation being a pile of steaming dung.
Electron is fine for prototyping, don't get me wrong. It's an easy and fast way to ship an application, cross-platform, with minimal effort and use (almost) all features a native app can, without things like CORS, permission popups, browser extensions or god knows what else getting in your way.
But it should always be a prototype and eventually be shifted to native applications because in the end, unlike Internet Explorer in its heyday which you could trivially embed as ActiveX and it wouldn't lead to resource gobbling, if you now have ten apps consuming 1GB RAM each just for the Electron base to run, now the user runs out of memory because it's like PHP - nothing is shared.
PWAs have the problem that for every interaction with the "real world" they need browser approval. While that is for a good reason, it also messes with the expectations of the user, and some stuff such as unrestricted access to the file system isn't available to web apps at all.
It took Sony's intervention to actually pull the game back into a playable state once - resulting in the so-called 60-day patch.
Somehow random modders were able to fix some of the most egregiously ignored issues (like an enemy type making no sound) quickly and effectively. Arrowhead ignored it, then denied it, then used the "gamers bad" tactic and banned people pointing it out. After a long time they finally fixed it, and tried to bury it in the patch notes too.
They have also been caught straight up lying about changes; the most recent one was "Apparently we didn't touch the Coyote", where they simply buffed enemies' resistance to fire, effectively nerfing the gun.
- For their PC not to reboot and BSOD (this was an issue a few months ago)
- Be able to actually finish a mission (game still crashes a lot just after extraction, it's still rare for the full team to survive 3 missions in a row)
- Be able to use weapon customisation (the game crashed, when you navigated to the page with custom paints)
- Continue to run, even when anybody else from the team was stimming (yes, any person in the team stimming caused others to slow down)
- Actually be able to hear one of the biggest enemies in the game
- To not issue stim/reload/weapon change multiple times, for them just to work (it's still normal to press stim 6 times in some cases, before it activates, without any real reason)
- Be able to use chat, when in the vehicle (this would result in using your primary weapon)
- Be able to finish drill type mission (this bugs out a lot still)
- Not be attacked by enemies that phase through buildings
- Not be attacked by bullets passing through terrain, despite the player bullets being stopped there
are just vocal players' complaints? A lot of those bugs went totally unaddressed for months. Some keep coming back in regressions. Some are just still ongoing. This is only a short list of things I came across while casually playing. It's a rare sight to have a full OP without an issue (even mission hardlocks still happen).
About Sony - I was specifically referring to Shams Jorjani's (CEO of Arrowhead) explanation to Hermen Hulst (the head of PlayStation Studios) of why the review score collapsed to 19%, among other issues.
A lot of issues are to do with the fact that the game seems to corrupt itself. If I have issues (usually performance related), I do a steam integrity check and I have zero issues afterwards. BTW, I've had to do this on several games now, so this isn't something that is unique to HellDivers. My hardware is good BTW, I check in various utils and the drives are "ok" as far as I can tell.
> - For their PC not to reboot and BSOD (this was an issue a few months ago)
This was hyped up by a few big YouTubers. The BSODs were because their PCs were broken. One literally had a burn mark on their processor (a known issue with some board/processor combos) and the BSODs went away when they replaced their processor. This tells me that there was something wrong with their PC and any game would have caused a BSOD.
So I am extremely sceptical of any claims of BSODs because of a game. What is almost always the case is that the OS or the hardware is at issue and playing a game will trigger the issue.
If you are experiencing BSODs I would make sure your hardware and OS are actually good, because they are probably not. BTW, I haven't had a BSOD in Windows for about a decade because I don't buy crap hardware.
> - Be able to actually finish a mission (game still crashes a lot just after extraction, it's still rare for the full team to survive 3 missions in a row)
False. A few months ago I played it for an entire day and the game was fine. Last week I played it a good portion of Saturday night. I'm in several large HellDivers focused Discord servers and I've not heard a lot of people complaining about it. Maybe 6 months ago or a year ago this was the case, but not now.
> Be able to use weapon customisation (the game crashed, when you navigated to the page with custom paints)
This happened for like about a week for some people and I personally didn't experience this.
> To not issue stim/reload/weapon change multiple times, for them just to work (it's still normal to press stim 6 times in some cases, before it activates, without any real reason)
I've not experienced this, nor heard anyone complain about it, and I am in like 4 different Helldivers-focused Discord servers.
> Not be attacked by enemies that phase through buildings
This can be annoying, but it happens like once in a while. It isn't the end of the world.
Generally speaking, I am too. That is unless there is kernel-level anticheat. In that case I believe it's fair to disregard all other epistemological processes and blame BSODs on the game out of principle
I am sorry but that is asinine and unscientific. You should blame BSODs on what is causing them. I don't like kernel anti-cheat but I will blame the actual cause of the issues, not assign blame on things which I don't approve of.
I am a long time Linux user, and many of the people complaining about BSODs on Windows had broken the OS in one way or another. Some were running weird stuff like 3rd party shell extensions that modify core DLLs, or they had installed every POS shovelware/shareware crap. That isn't Microsoft's fault if you run an unsupported configuration of the OS.
Similarly, the YouTubers that were most vocal about Helldivers problems did basically no proper investigation other than saying "look, it crashed", when it was quite clearly their broken hardware that was the issue. As previously stated, their CPU had a burn mark on one of the pins; some AM5 boards had faults that caused this IIRC. So everything indicated hardware failure being the cause of the BSOD. They still blamed the game, probably because it got them more watch time.
During the same time period when people were complaining about BSODs, I didn't experience one. I was running the same build of the game as them, playing on the same difficulty, and sometimes recording it via OBS (just like they were). What I didn't have was an AM5 motherboard; I have an older AM4 motherboard which doesn't have these problems.
> False. A few months ago I played it for an entire day and the game was fine. Last week I played it a good portion of Saturday night. I'm in several large HellDivers focused Discord servers and I've not heard a lot of people complaining about it. Maybe 6 months ago or a year ago this was the case, but not now.
I specifically mean the exact time, right after the pelican starts to fly. I keep seeing "<player> left" or "disconnected". Some come back and I have a habit of asking: "Crash?", they respond with "yeah"
I was just about to replace my GPU (a 4090 at that!); I had them 3 times a session. I sank a lot of hours into debugging it (replaced cables, switched PSUs between desktops) and just gave up. After a few weeks, lo and behold, a patch comes out and it all disappears.
A lot of people just repeat hearsay about the game
The answer to every such claim is just: no. But it's click bait gold to the brain damage outrage YouTuber brigade.
Accidentally using a ton of resources might reveal weaknesses, but it is absolutely not any software vendor's problem that 100% load might reveal your thermal paste application sucked or Nvidia is skimping on cable load balancing.
The issue is 1) actually exaggerated in the community, but not without actual substance, and 2) getting disregarded exactly because of the exaggerations. It was a very real thing.
I also happen to have a multi gpu workstation that works flawlessly too
These guys are running an intensive game on the highest difficulty while streaming, and they probably have a bunch of browser windows and other software running in the background. Any weakness in the system is going to be revealed.
I had performance issues during that time and I had to restart game every 5 matches. But it takes like a minute to restart the game.
When an orbital precision strike reflects off the hull of a factory strider and kills your friend, or eagle one splatters a gunship, or you get ragdolled for like 150m down a huge hill and then a devastator kills you with an impassionate stomp.
Those moments elevate the game and make it so memorable and replayable. It feels like something whacky and new is around every corner. Playing on PS5 I’ve been blessed with hardly any game-breaking bugs or performance issues, but my PC friends have definitely been frustrated at times
In fact, the whole point of their games is that they are coop games where it is easy to accidentally kill your allies in hilarious manners. It is the reason, for example, why you use complex key sequences to call in stratagems; it is intentional so that you can make a mistake and cast the wrong thing.
The dialing adds friction to tense situations, which is okay as a mechanic.
The game has semi-regular patches where they seem to fix some things and break others.
The game has a lot of hidden mechanics that aren't obvious from the tutorial, e.g. many weapons have different fire modes and fire rates, and stealth is an option in the game. The game has a decent community and people are friendly for the most part. It also has the "feature" of being playable in about 20-40 minute sessions, and you can just put it down again for a bit and come back.
The missing information also encourages positive interactions among the community - newer players are expected to be missing lots of key information, so teaching them is a natural and encouraged element of gameplay.
I stopped playing the game awhile ago, but the tutorial always struck me as really clever.
i want to play the game, like now, and i'll read the forums after i figure out that i'm missing something important
People make much more smooth and complex experiences in old engines.
You need to know your engine as a dev, and not cross its limits at the cost of user experience and then blame your tools...
The whole story about more data making load times better is utter rubbish. It's a sign of piss-poor resource management and usage. For the game they have, they should have realized a 130GB install is unacceptable. It's not like they have very elaborate environments - a lot of similar textures and structures everywhere. It's not like it's some huge unique world like The Witcher or such games...
There is an astronomical amount of information available for free on how to optimise game engines, loads of books, articles, courses.
How much money do you think they have made so far?
"Arrowhead Game Studios' revenue saw a massive surge due to Helldivers 2, reporting around $100 million in turnover and $76 million in profit for the year leading up to mid-2025, significantly increasing its valuation and attracting a 15.75% investment from Tencent"
$76 million in profit but they can't figure out how to optimise a game engine. Get out.
The fact it is un-optimised can be forgiven because the game has plenty of other positives, so people like myself are willing to overlook them.
I've got a few hundred hours in the game (I play for maybe an hour in the evening) and for £35 it was well worth the money.
A fun game is a fun game.
If anything, it's a testament to how good a job they've done making the game.
Is that supposed to be praise?
I played it a bit after release and have 230 hours. I liked the game and it was worth my money.
I do credit their sense of humor about it though.
Was it a bad game? Or jankey? What parts of Helldivers are "making sense" now?
You cast spells in a similar way to calling in stratagems in HD2.
The spell system was super neat. There are several different elements (fire, air, water, earth, electricity, ice, and maybe something else; it's been a while since I played). Each element can be used on its own or is combinable. Different combinations cast different spells. Fire + water makes steam, for instance. Ice + air is a focused blizzard, etc.
There are hundreds to learn, and that's your main weapon in the game. There's even a spell you can cast that will randomly kick someone you're playing with out of the game.
It’s great fun with friends, but can be annoying to play sometimes. If you try it, go with kb/m. It supports controller, but is way more difficult to build the spells.
Water, Life, Arcane, Shield, Lightning, Cold, Fire, and Earth. [0] It's worth noting that, though you can combine most of the elements to form new spells (and with compounding effects, for example wetting or steaming an enemy enhances lightning damage), you cannot typically combine opposites like lightning/ground, which will instead cancel out. Killed myself many times trying to cast lightning spells while sopping wet.
In my experience, though, nobody used the element names—my friends and I just referred to them by their keybinds. QFASA, anyone?
[0] https://magicka.fandom.com/wiki/Elements
More or less nothing is optimized these days, and game prices and budgets have gone through the roof. Compared to the other games available these days (combined with how fun the game is) I definitely give HD2 a big pass on a lot of stuff. I'm honestly skeptical of Sony's involvement being a benefit, but that's mostly due to my experience regarding their attempts to stuff a PSN ID requirement into HD2 as well as their general handling of their IPs. (Horizon Zero Dawn is not only terrible, but they seem to try to force interest with a new remake on a monthly basis.)
Not true, lots of games are optimized, but it's one of those tasks that almost no one notices when you do it great, but everyone notices when it's missing, so it's really hard to tell by just consuming ("playing") games.
> I'm honestly skeptical of Sony's involvement being a benefit
I'm not, SIE have amazing engineers, probably the best in the industry, and if you have access to those kind of resources, you use it. Meanwhile, I agree that executives at Sony sometimes have no clue, but that doesn't mean SIE helping you with development suddenly has a negative impact on you.
I don't mean this as a counter-argument -- I'm really interested. What are some good examples of very recent optimized games?
Thanks for the list!
Yeah, Unreal Engine (5 almost specifically) is another example of things that are unoptimized by default, very easy to notice, but once you work on it, it becomes invisible and it's not that people suddenly cheer, you just don't hear complaints about it.
It's also one of those platforms where there is a ton of help available from Epic if you really want it, so you can tune the defaults BEFORE you launch your game, but hardly anyone seemingly does that, and then both developers and users blame the engine, instead of blaming the people releasing the game. It's a weird affair all around :)
I'm not sure having the support of Sony is that gold-standard imprint that people think it is.
As an example for overly realistic physics, projectile damage is affected by projectile velocity, which is affected by weapon velocity. IIRC, at some point whether you were able to destroy some target in two shots of a Quasar Cannon or three shots depended on if you were walking backwards while you were firing, or not.
That sounds like a bug, not an intentional game design choice about the game logic, and definitely unrelated to realism vs not realism. Having either of those as goals would lead to "yeah, bullet velocity goes up when you go backwards" being an intentional mechanic.
Game Freak could not finish the project, so they had to be bailed out by Nintendo with an easy-to-program game so the company could get some much needed cash (the Yoshi puzzle game on NES). Then years later, with no end to the game in sight, Game Freak had to stoop to contracting Creatures Inc. to finish the game. Since they had no cash, Creatures Inc. was paid with a portion of the Pokémon franchise.
Pokémon was a shit show of epic proportions. If it had been an SNES game it would have been canceled and Game Freak would have closed. The low development cost of Game Boy and the long life of the console made Pokémon possible.
Data sizes have continued to grow and HDD seek times haven't gotten better due to physics (even if streaming throughput probably has kept up), so the assumption isn't too bad considering history.
It's good that they actually revisited it _when they had time_, because launching a game, especially a multiplayer one, will run into a lot of breaking bugs, and this (while a big one, pun intended) is still by most classifications a lower priority issue.
I'm not an arrowhead employee, but my guess is at some point in the past, they benchmarked it, got a result, and went with it. And that's about all there is to it.
To be fair, the massive install size was probably the least of the problems with the game; its performance has been atrocious, and when they released for Xbox, the update that came with it broke the game entirely for me and it was unplayable for a few weeks until they released another update.
In their defense, they seem to have been listening to players and have been slowly but steadily improving things.
Playing Helldivers 2 is a social thing for me where I get together online with some close friends and family a few times a month and we play some helldivers and have a chat, aside from that period where I couldn't play because it was broken, it's been a pretty good experience playing it on Linux; even better since I switched from nvidia to AMD just over a week ago.
I'm glad they reduced the install size and saved me ~130GB, and I only had to download about another 20GB to do it.
>We now know that, contrary to most games, the majority of the loading time in HELLDIVERS 2 is due to level-generation rather than asset loading. This level generation happens in parallel with loading assets from the disk and so is the main determining factor of the loading time. We now know that this is true even for users with mechanical HDDs.
they did absolutely zero benchmarking beforehand, just went with industry hearsay, and decided to double it just in case.
Do you benchmark every single decision you make on every system on every project you work on? Do you check that that Redis operation is actually O(1), or do you rely on hearsay? Do you benchmark every single SQL query, every DTO, the overhead of the DI framework, connection pooler, JSON serializer, log formatter? Do you ever rely on your own knowledge without verifying the assumptions? Of course you do - you're human and we have to make some baseline assumptions, and sometimes they're wrong.
You would be surprised what some people are playing games on, e.g. I know people that still use Windows 7 on an AMD Bulldozer rig. Atypical for sure, but not unheard of.
old stuff is common, and doubly so for a lot of the world, which ain't rich and ain't rockin new hardware
Pretending that this is an outrageous decision ignores that the data and the commonly assumed wisdom at the time said there were still a lot of people using HDDs.
They've since rectified this particular issue and there seems to be more criticism of the company after fixing an issue.
https://en.wikipedia.org/wiki/Wikipedia:Chesterton%27s_fence
It was a real issue in the past with hard drives and small media assets. It's still a real issue even with SSDs. HDD/SSD IOPS are still way slower than contiguous reads when you're dealing with a massive amount of files.
At the end of the day it requires testing, which requires time, at a point in the project when you don't have a lot of it.
The Fence is a parable about understanding something that already exists before asking to remove it. If you cannot explain why it exists, you shouldn't ask to remove it.
In this case, it wasn't something that already existed in their game. It was something that they read, then followed (without truly understanding whether it applied to their game), and upon re-testing some time later, realized it wasn't needed and caused detrimental side-effects. So it's not Chesterton's Fence.
You could argue they followed a videogame industry practice to make a new product, which is reasonable. They just didn't question or test their assumptions that they were within the parameters of said industry practice.
I don't think it's a terrible sin, mind you. We all take shortcuts sometimes.
>wait for the whole list to finish rather than blocking on every tiny file.
And this is the point. I can make a test that shows exactly what's going on here. Make a random file generator that generates 100,000 4k files. Now, write them to a hard drive with other data and things going on at the same time. Now, in another run of the program, have it generate 100,000 4k files and put them in a zip.
Now, read the set of 100k files from disk and at the same time read the 100k files in a zip....
One finishes in less than a second and one takes anywhere from a few seconds to a few minutes depending on your disk speeds.
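A minimal sketch of that experiment (file counts, sizes, and paths are arbitrary; you also need to drop the OS page cache between the two timed reads, or the second one just measures RAM):

    import os, time, zipfile

    N, SIZE = 100_000, 4096
    os.makedirs("small_files", exist_ok=True)

    # Write N small random files and the same data into one stored (uncompressed) archive.
    with zipfile.ZipFile("bundle.zip", "w", compression=zipfile.ZIP_STORED) as zf:
        for i in range(N):
            data = os.urandom(SIZE)
            with open(f"small_files/{i}.bin", "wb") as f:
                f.write(data)
            zf.writestr(f"{i}.bin", data)

    t0 = time.perf_counter()
    for i in range(N):
        with open(f"small_files/{i}.bin", "rb") as f:
            f.read()
    t1 = time.perf_counter()

    with zipfile.ZipFile("bundle.zip") as zf:
        for name in zf.namelist():
            zf.read(name)
    t2 = time.perf_counter()

    print(f"100k small files: {t1 - t0:.1f}s   single archive: {t2 - t1:.1f}s")

On a fresh HDD the gap is dramatic; on an SSD it shrinks a lot, which is basically the whole Helldivers story.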
This has nothing to do with consoles, and only affects PC builds of the game
They basically just made the numbers up. Wild.
The wife cuts the end off of the ham before putting it in the oven. The husband, unwise in the ways of cooking, asks her why she does this.
"I don't know", says the wife, "I did it because my mom did it."
So they call the mom. It turns out that her mother did it, so she did too.
The three of them call the grandma and ask "Why did you cut the end off of the ham before cooking it?"
The grandma laughs and says "I cut it off because my pan was too small!"
I don’t know about the Xbox, but on PS4 the hard drive was definitely not fast at all
>we looked at industry standard values and decided to double them just in case.
it had no serious or glaring impact on their bottom line.
thus it was the right call, and if they didn't bother to fix it they'd still be rolling in $$$$
It will make them a lot of money and is thus the right call. Who cares about customers am I right? They'd still be rolling in $$$$.
Pay $2,000 for indie games so studios could grow without being beholden to shareholders, and we could perhaps get that "perfect" QA, etc.
It's a fucking market economy and people aren't making pong level games that can be simply tuned, you really get what you pay for.
Instead they blindly did extra work and 6x'ed the storage requirement.
> multi minute load times
23 GB / (100 MB/s) ≈ 235 s ≈ 3.9 minutes
So in the worst case when everything is loaded at once (how, on a system with < 32 GB of RAM?) it takes about 4 minutes.
Considering GTA whatever version could sit for 15 minutes at the loading screen because nobody bothered to check why - the industry could really say not to bother.
I feel like writes would probably be quite painful, but game assets are essentially write-once read-forever, so not the end of the world?
As an aside, it's messed up that people with expensive SSDs are unnecessarily paying this storage tax. Just feels lazy...
https://www.reddit.com/r/Helldivers/comments/1mw3qcx/why_the...
PS: just found it. According to this Steam discussion it does not download the duplicate data and back then it only blew up to ~70 GB.
https://steamcommunity.com/app/553850/discussions/0/43725019...
[0] https://partner.steamgames.com/doc/sdk/uploading#AppStructur...
You never assume something is an optimization or needed and never do hypothetical optimizations
I can see why it would happen in this case though, gamedev is chaotic and you're often really pressed for time
Wow! It looks like I do indeed know better.
[0] https://news.ycombinator.com/item?id=10066338
they're a fantastically popular franchise with a ton of money... and did it without the optimizations.
if they never did these optimizations they'd still have a hugely popular, industry leading game
minor tweaks to weapon damage will do more to harm their bottom line compared to any backend optimization
That being said, cartridges were fast. The move away from cartridges was a wrong turn.
Cartridges were also crazy expensive. An N64 cartridge cost about $30 to manufacture with a capacity of 8 MB, whereas a PS1 CD-ROM was closer to a $1 manufacturing cost with a capacity of 700 MB. That's $3.75/MB versus $0.0014/MB, over 2600x more expensive!
Without optical media most games from the late 90s & 2000s would've been impossible to make - especially once it got to the DVD era.
I've kinda given up on physical games at this point. I held on for a long time, but the experience is just so bad now. They use the cheapest, flimsiest, most fragile plastic in the cases. You don't get a nice instruction manual anymore. And honestly, keeping a micro SD card in your system that can hold a handful of games is more convenient than having to haul around a bunch of cartridges that can be lost.
I take solace in knowing that if I do still have a working Switch in 20 years and lose access to games I bought a long time ago, hopefully the hackers/pirates will have a method for me to play them again.
You've been paying attention to the wrong sources for information about NAND flash. A new Switch cartridge will have many years of reliable data retention, even just sitting on a shelf. Data retention only starts to become a concern for SSDs that have used up most of their write endurance; a Switch cartridge is mostly treated as ROM and only written to once.
I've read about people's 3DS cartridges already failing just sitting on a shelf.
I've been on PS5 since launch and aside from Baldur's Gate 3, it's been the best game this gen IMO.
The negativity I see towards the game (especially on YouTube) is weird. Some of the critiques seem legit, but a lot of it feels like rage bait, which describes a lot of YT videos around gaming lately.
Anyway, a big improvement for a great game. Seems like less of an incentive now to uninstall if you only play now and then.
iO
Just don’t get caught at the end!
https://nee.lv/2021/02/28/How-I-cut-GTA-Online-loading-times...
I've got to say, I do find it somewhat unusual that despite the fact that every HN engineer has John Carmack-level focus on craftsmanship, about 1 in 100k here produce that kind of outcome.
I don’t get it. All of you guys are good at pointing out how to do good engineering. Why don’t you make good things?
https://www.arrowheadgamestudios.com/2025/10/helldivers-2-te...
But for a mechanical drive, you'll get much better throughput on sequential reads than random reads, even with command queuing. I think earlier discussion showed it wasn't very effective in this case, and taking 6x the space for a marginal benefit for the small % of users with mechanical drives isn't worthwhile...
This does not work if you're doing tons of small IO and you want something fast.
Let's say we're on an HDD with 200 IOPS and we need to read 3000 small files scattered randomly across the drive.
Well, at minimum this is going to take 15 seconds, plus any additional seek time.
Now, let's say we zip up those files into a solid archive. You'll read it in half a second. The problem comes in when different levels all need a different 3000 files. Then you end up duplicating a bunch of stuff.
Now, where this typically falls apart for modern game assets is that they're getting very large, which tends to make seek time matter much less.
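A quick sketch of that comparison in Python; the 200 IOPS and 3000-file figures are from the comment above, while the archive size and sequential throughput are assumptions chosen just to illustrate the "half a second" claim:

```python
# Back-of-the-envelope: 3000 small random reads vs one contiguous archive read.
iops = 200            # random reads/second the HDD can sustain (from the comment)
n_files = 3000        # small assets scattered across the drive (from the comment)
archive_mb = 75       # assumed size of the solid archive holding the same data
seq_mb_s = 150        # assumed sequential read throughput of the same drive

random_reads_s = n_files / iops           # ~15 s, dominated by seeks
solid_archive_s = archive_mb / seq_mb_s   # ~0.5 s, dominated by throughput
print(f"random small files: {random_reads_s:.1f} s, solid archive: {solid_archive_s:.2f} s")
```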
For asynchronous IO you can just do inward/outward passes to amortize the seek time over multiple files.
While it may not have been obvious, I have been taking the archiving or bundling of assets into a bigger file for granted. The obvious benefit is that the filesystem will tend to store the bundled game files contiguously. This has nothing to do with file duplication, though, and is a somewhat separate topic, because it costs nothing and only has benefits.
The asynchronous file IO case for bundled files is even better, since you can just hand the internal file offsets to the async file IO operations and get all the relevant data in parallel; your only constraint is deciding on a sensible lower bound for the block size, which is high for HDDs and low for SSDs.
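A minimal sketch of that idea in Python, assuming a Unix-style os.pread and a hypothetical manifest of (offset, length) pairs that would normally come from the bundle's own header:

```python
# Sketch (not any engine's real loader): read several assets out of one bundle
# file in parallel. Each worker reads at an absolute offset via os.pread
# (Unix-only), so there is no shared file cursor to contend over.
import os
from concurrent.futures import ThreadPoolExecutor

def read_asset(fd: int, offset: int, length: int) -> bytes:
    return os.pread(fd, length, offset)  # positional read at `offset`

def load_assets(bundle_path: str, manifest: dict[str, tuple[int, int]]) -> dict[str, bytes]:
    fd = os.open(bundle_path, os.O_RDONLY)
    try:
        with ThreadPoolExecutor(max_workers=8) as pool:
            futures = {name: pool.submit(read_asset, fd, off, size)
                       for name, (off, size) in manifest.items()}
            return {name: fut.result() for name, fut in futures.items()}
    finally:
        os.close(fd)

# Hypothetical usage; offsets and sizes would come from the bundle's index:
# assets = load_assets("level01.pak", {"grass.dds": (0, 4_194_304),
#                                      "rock.dds": (4_194_304, 2_097_152)})
```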
>For asynchronous IO you can just do inward/outward passes to amortize the seek time over multiple files.
Here's a random blog post that has benchmarks for a 2015 HDD:
https://davemateer.com/2020/04/19/Disk-performance-CrystalDi...
It shows 1.5 MB/s for random 4K performance at a high queue depth, which works out to just under 400 IOPS. At queue depth 1 (so synchronous), performance is around a third of that.
As the other user stated, just look up CrystalDiskMark results for both HDDs and SSDs and you'll see hard drives do about a third of an MB/s on random file IO, while the same drive will do around 400 MB/s on a contiguous read. For things like this, reading a zip and decompressing in memory is "typically" (again, you have to test this) orders of magnitude faster.
It's a well known technique but happened to not be useful for their use case.
That does force you to duplicate some assets a lot. It's also more important the slower your seeks are. This technique is perfect for disc media, since it has a fixed physical size (so wasting space on it is irrelevant) and slow seeks.
I'd love to see it analysed. Specifically, the average number of non-sequential jumps vs the overall size of the level. I'm sure you could avoid jumps within megabytes. But if someone ever got close to filling up the disk in the past, the chances of getting contiguous gigabytes are much lower. This paper effectively says that if you have long files, gaps are almost guaranteed: https://dfrws.org/wp-content/uploads/2021/01/2021_APAC_paper... so at that point, you may be better off preallocating the individual files and eating the cost of switching between them.
But it also depends on how the assets are organized: you can probably group the level-specific assets into a sequential section, and maybe shared assets could be grouped so that related assets end up sequential too.
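For illustration, a minimal Python sketch of that grouping; the two-group layout and the JSON manifest are assumptions for the example, not how any particular engine packs its data:

```python
# Write level-specific assets as one contiguous run and shared assets as
# another, recording name -> (offset, size) in a small manifest.
import json, os

def write_pack(pack_path: str, level_assets: list[str], shared_assets: list[str]) -> dict:
    manifest = {}
    with open(pack_path, "wb") as pack:
        for group in (level_assets, shared_assets):  # each group stays contiguous
            for path in group:
                with open(path, "rb") as f:
                    data = f.read()
                manifest[os.path.basename(path)] = (pack.tell(), len(data))
                pack.write(data)
    with open(pack_path + ".manifest.json", "w") as f:
        json.dump(manifest, f, indent=2)
    return manifest
```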
Nowadays? No. Even those with hard disks will have a lot more RAM, and thus more disk cache. And you're even guaranteed SSDs on consoles. I think in general no one tries this technique anymore.
By default, Windows automatically defragments filesystems weekly if necessary. It can be configured in the "defragment and optimize drives" dialog.
https://web.archive.org/web/20100529025623/http://blogs.tech...
old article on the process
Someone installing a 150 GB game surely has 150 GB+ of free space, and there is likely to be a lot of contiguous free space.
If you break it up into smaller files, those are likely to be allocated all over the disk; plus you'll have delays on reading, because Windows Defender makes opening files slow. If you have a single large file that contains all resources, even if that file is mostly sequential, there will be sections that you don't need, and read-ahead caching may work against you, as it will tend to read things you don't need.
Which makes me think: have there been any advances in disk scheduling in the last decade?
If the game were ~20 GB instead of ~150 GB, almost no player with the required CPU+GPU+RAM combination would be forced to put it on an HDD instead of an SSD.
Hard drives are much, much faster than optical media - on the order of 80 seeks per second and 300 MB/s sequential versus, like, 4 seeks per second and 60 MB/s sequential (for DVD-ROM).
You still want to load sequential blocks as much as possible, but you can afford a few seeks. Assuming a traditional engine design (no megatextures etc.), you probably don't want to load each texture from a separate file, but you can certainly afford to load a block of grass textures, a block of snow textures, etc. Also, throughput is 1000x higher than a PS1 CD-ROM (300 kB/s), so you can presumably afford to skip over parts of your sequential runs.
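To put rough numbers on "you can afford a few", a small Python sketch using the seek and throughput figures quoted above; the 10-second budget and 500 MB level size are made-up illustrative values:

```python
# How many separate sequential runs can a level load afford within a time budget?
def affordable_runs(budget_s: float, level_mb: float, seeks_per_s: float, mb_per_s: float) -> int:
    transfer_time = level_mb / mb_per_s           # time spent actually reading data
    seek_time_left = max(0.0, budget_s - transfer_time)
    return int(seek_time_left * seeks_per_s)      # how many seeks fit in what's left

print("DVD-ROM:", affordable_runs(10, 500, 4, 60))    # ~6 runs: stay almost fully sequential
print("HDD:    ", affordable_runs(10, 500, 80, 300))  # ~666 runs: per-category files are fine
```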