• funnystuff97@lemmy.world
    10 months ago

    In a similar vein, Arkham Knight (and in some cases Arkham City) looked worse in cutscenes if you maxed out the graphics settings. Obviously not if you ran it on a potato, but the games are somewhat well optimized these days*.

    *At launch, Arkham Knight was an unoptimized, buggy mess. It has since gotten much better.

    • nevetsg@aussie.zone

      I am playing through Rise of the Tomb Raider in 4K and having a similar experience. I think the cutscenes are in 1080p.

    • Otherwise_Direction7@monyet.ccOP

      Wait, you mean the game’s gameplay looks better than its actual cutscenes?

      But how? Does the game use FMV for the cutscenes or something?

      • funnystuff97@lemmy.world

        The cutscenes were rendered at fixed graphics settings that you could exceed by maxing out your own. Plus, because they were pre-rendered video, there must have been some compression involved: you could just tell when you were in a cutscene, as it was grainier and there was a smidge of artifacting. Don’t quote me on this, but I believe the cutscenes were rendered at, like, 1080p, so if you were playing at 4K it would be a very noticeable downgrade. (Note that I did not and still do not have a 4K monitor)
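For a sense of how large that gap would be, here is a quick back-of-the-envelope calculation (illustrative only; the 1080p cutscene figure is the commenter's guess, not a confirmed spec):

```python
# Illustrative: how much a fixed 1080p pre-rendered cutscene gets stretched
# when shown on a 4K (UHD) display. Resolutions are standard values, not
# anything specific to Arkham Knight.
gameplay = (3840, 2160)   # 4K UHD gameplay resolution
cutscene = (1920, 1080)   # assumed pre-rendered cutscene resolution

scale_x = gameplay[0] / cutscene[0]
scale_y = gameplay[1] / cutscene[1]
pixel_ratio = (gameplay[0] * gameplay[1]) / (cutscene[0] * cutscene[1])

# Each cutscene pixel has to cover a 2x2 block of screen pixels, i.e. the
# video carries 4x less detail than native-resolution gameplay.
print(scale_x, scale_y, pixel_ratio)
```

On top of that raw upscale, video compression throws away detail of its own, which is why the grain and artifacting stand out even before resolution is considered.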

        Although, thinking about it again, I do vividly remember some in-engine cutscenes in Arkham Knight. I’ll have to replay that game sometime to jog my memory.