How does this KEEP GETTING WORSE??

  • starman2112@sh.itjust.works · 8 months ago

    And this is not even beginning to touch content and features from other released versions of these games from 20 years ago not present, like four-screen splitscreen.

    It’s so cool and amazing that we finally have home theatre systems in every fucking house, and that’s when devs decided we don’t get split screen anymore. Modern hardware is wasted on modern devs. Can we send them back in time to learn how to optimize, and bring back the ones that knew how to properly utilize hardware?

    • Saik0@lemmy.saik0.com · 8 months ago

      It’s so cool and amazing that we finally have home theatre systems in every fucking house

      Yeah, I’ve noticed this too and it bothers me. We had 4-way split on 20-inch tube TVs, on hardware that measured its RAM in MBs… But on modern 75+ inch TVs, on consoles with GBs of RAM… Nah, too hard. You need to buy 4 copies of the game and have 4 separate setups… and probably need to be in 4 separate houses.

      Couch co-op dying is basically when I stopped bothering with consoles altogether. If I’m going to use a glorified PC, I might as well just use a full-fat PC and ignore consoles entirely. I miss the N64 days.

      • barsoap@lemm.ee · 8 months ago

        We had 4-way split on 20-inch tube TVs, on hardware that measured its RAM in MBs

        And those machines were still compute-bound. Things like the N64 essentially spent resources per pixel; mesh data was so light that the whole level could sit in the limited RAM at once – and it needed to, because there were no CPU cycles left over to implement asset streaming. Nowadays the only stuff in RAM is what you actually see, and with four perspectives, yes, you need four times the VRAM, because every player can be looking at something completely different.

        Sure, you can write the game to use a quarter of the resources, but then you either use that for singleplayer and get bad reviews for bad graphics, or you develop two completely different sets of assets, exploding development costs. I’m sure there also exist shady kitten-drowning marketing fucks who would object on the grounds of “but hear me out, let’s just sell them four copies instead”, but they don’t even get to object, because production-wise, split-screen isn’t an option nowadays for games that aren’t specifically focusing on that kind of thing. You can’t just add it to any random title for a tenner.
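        The memory argument above can be sketched as a rough back-of-envelope calculation. All figures here are invented for illustration (real budgets vary by engine and game); the point is that the framebuffer cost of splitting the screen is roughly neutral, while the worst-case streamed-asset working set multiplies with the number of independent views:

```python
# Rough, illustrative VRAM-budget sketch for split-screen rendering.
# All numbers are made-up assumptions for demonstration, not real game data.

def framebuffer_mb(width, height, bytes_per_pixel=4):
    """Memory for one RGBA color buffer at the given resolution, in MB."""
    return width * height * bytes_per_pixel / (1024 * 1024)

# Single-player: one 4K view, one set of streamed assets resident in VRAM.
single_fb = framebuffer_mb(3840, 2160)      # about 31.6 MB
single_assets = 6000.0                      # assumed MB of streamed assets

# Four-way split: each quadrant is 1920x1080, so total framebuffer cost
# is unchanged -- but each player can look at different content, so in
# the worst case the streamed-asset working set multiplies by four.
split_fb = 4 * framebuffer_mb(1920, 1080)   # same total as one 4K buffer
split_assets_worst = 4 * single_assets      # four disjoint views of the world

print(f"single-player budget: {single_fb + single_assets:.0f} MB")
print(f"4-way split, worst case: {split_fb + split_assets_worst:.0f} MB")
```

        The takeaway matches the comment: the screen real estate divides cleanly, but the per-view asset residency does not.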

    • ipkpjersi@lemmy.ml · 8 months ago (edited)

      Modern hardware is wasted on modern devs. Can we send them back in time to learn how to optimize, and bring back the ones that knew how to properly utilize hardware?

      I think a lot of the blame is erroneously placed on devs, or “devs” is used as a colloquialism. Anyone who has worked in a corporate environment as a developer knows that the developers are not the ones making the decisions. Do you really think developers want to create a game that is bad, to have their name attached to something bad, and to know they created something bad? No, developers want to make a good game, but time constraints and horrible management prioritizing the wrong things (mostly microtransactions, monetizing the hell out of games, etc.) result in bad games. Game development is also harder than it used to be: games are more complex, hardware is more complex, and developers are expected to produce results in less time than ever before.

      It’s an annoyance of mine and I’m sure you meant no harm by it, but as a developer (and as someone who has done game development on the side and knows a lot about the game development industry), it’s something that bothers me when people blame bad games solely on devs, and not on the management who made decisions which ended up with games in a bad state.

      With that said, I agree with your sentiment about modern games not taking advantage of long-forgotten cool features like four-screen splitscreen, offline modes (mostly in online games), arcade modes, etc. I really wish these features were prioritized.

    • catloaf@lemm.ee · 8 months ago

      It’s not a question of capability. It’s a question of the cost-benefit of spending developer time on a feature not many people would use.

      Couch co-op was a thing because there was no other way for you to play together from your own homes. Nowadays it’s a nice-to-have, because you can jump online any time and play together, anywhere in the world, without organizing everyone to show up at one house.

      • I Cast Fist@programming.dev · 8 months ago

        It’s a question of the cost-benefit of spending developer time on a feature not many people would use

        Which is super ironic when you look at games that had an obviously tacked-on, rushed multiplayer component in the first place, such as Spec Ops: The Line, BioShock 2, and Mass Effect 3.