Copied and pasted from broken-forum for your viewing pleasure, just to make it clear that whinging about the 'poor console port' is a severely annoying case of simple gamer entitlement.
---------
Also, because this issue really *** annoys me, massive incoming
technical post with reasons why it may have been technically difficult
or visually undesirable to have higher framebuffer resolutions or
framerates. As a bonus, I’ll likely throw in other stuff they dealt with
which might have impacted their decisions on rendering related changes.
I will also rate each independent point with two 1-10 scales, the first
representing how much entitled PC gamers would ***** and moan about it,
with 10 being the Whiny Entitled PC Gamer Who Chooses Not To Buy It
Because It is a Total Deal Breaker, Man, and the second representing
development cost, with 10 being "*** it, bin the project, it costs too
much."
1. UI.
This one is low hanging fruit. If you know exactly what resolution
your game will run at, it is often significantly easier to build all of
the UI in such a fashion that it just lands on screen where you want it.
This means all the game UI could in theory be on one large texture
that's just slapped to the screen and that's that. Even if they didn't
do that, the resolution is guaranteed to be perfect, and the positions
are close to guaranteed to be hardcoded. Meaning that if they were to
up-res the framebuffer, you would have huge chunky blocky UI that would
be immediately at odds with the rest of the game's high resolution. To
fix it would require rewriting a large part of the UI system to either
properly scale everything, or properly position relative to screen
edges, and having the artists completely redraw all of the UI such that
it would look good or better at higher resolutions.
Whine: 7
Cost: 4
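To make the hardcoded-UI problem concrete, here's a minimal sketch (all names hypothetical, nothing to do with their actual UI code) of the difference between hardcoded pixel positions and resolution-relative placement:

```python
# Hypothetical illustration of hardcoded vs. resolution-relative UI placement.
REFERENCE_W, REFERENCE_H = 1024, 720  # the resolution the UI was authored for

def health_bar_pos_hardcoded():
    # Authored once against a known framebuffer: dead simple, zero flexibility.
    return (20, 680)

def health_bar_pos_relative(screen_w, screen_h):
    # Resolution-independent version: anchor to the bottom-left corner and
    # scale the authored offsets by the ratio to the reference resolution.
    scale = screen_h / REFERENCE_H
    return (round(20 * scale), screen_h - round((REFERENCE_H - 680) * scale))

print(health_bar_pos_hardcoded())           # correct only at 1024x720
print(health_bar_pos_relative(1920, 1080))  # lands in the same relative spot
```

The first version is all you need when the resolution is locked; the second is the kind of rework (per widget, plus redrawn art) that the cost rating above is pricing in.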
2. Texture mapping (including normal maps).
Given the game's internal low resolution, the look of their art was
probably balanced such that the artists knew the target resolution.
Given the rough size of enemies on screen, and the graphical look of the
game, I'm expecting they made heavy use of low res normal maps in order
to get the level of detail they wanted on characters and enemies. Were
you to upres the framebuffer without creating new normal maps, it's
possible for characters to suddenly look like they are all wearing
outfits made of small colored bathroom tile, as a single pixel of a
normal map would map to significantly more screen space in a roughly
square fashion.
The diffuse textures will also be nearly the same resolution as the
normal maps, because if the two are drastically different they'd look
absolutely terrible together.
Whine: 5
Cost: 7 (10 if including the world).
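A quick back-of-the-envelope (assumed map size and screen coverage, purely illustrative) shows why a low-res normal map turns into bathroom tile at a higher framebuffer resolution:

```python
# Illustrative only: assume a 256x256 normal map stretched across a character
# that fills roughly a quarter of the screen's width and half its height.

def texel_screen_coverage(fb_w, fb_h, map_size=256, cover_w=0.25, cover_h=0.5):
    # How many screen pixels does a single normal-map texel occupy?
    px_w = (fb_w * cover_w) / map_size
    px_h = (fb_h * cover_h) / map_size
    return px_w * px_h

print(texel_screen_coverage(1024, 720))   # ~1.4 screen pixels per texel
print(texel_screen_coverage(1920, 1080))  # ~4.0: each texel is a visible tile
```

The ratio is just the pixel-count ratio of the two framebuffers (about 2.8x), so every texel that was roughly one pixel becomes a little square you can see.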
3. Low polygon models.
The game world is large and open enough that the character and enemy
models are likely quite low poly, only you can't notice it at their
target framebuffer size. Clever use of texture mapping and normal
mapping is what generally lets them get away with this. But at a higher
resolution, the magic disappears and suddenly you are looking at blocky
models. Which is *especially* apparent if they have low resolution
textures.
Whine: 4
Cost: 10
4. Fill rate.
A lot of the really interesting and cool effects they have for a lot of
the enemies, bosses especially (Sif immediately comes to mind), use a
ton of fill rate by massively layering transparent polygons or
particles. The cost of these kinds of effects scales directly with the
pixel count, which means it grows quadratically as you raise the
resolution. Fixing it would require remodelling,
retexturing, and likely redesigning the problem models so they don't
look like complete ***, and don't drop the framerate to single digits
when they suddenly take up the entire screen.
Some math (assuming Sif has about 8 layers of fur, which seems likely from the screens I've examined):
Frame buffer at 1024x720, wolf fills the screen:
This means it has to draw 1024x720x8 pixels in a worst case. That’s 5.9m
pixels. Per frame, of course. So at thirty FPS it’s trying to use about
177 megapixels per second of fill rate.
Frame buffer at 1920x1080 (cause if you are a pc gamer, I’m sure you
have at least this, otherwise what the *** are you complaining about?):
1920x1080x8 pixels in a worst case. 16.6m pixels. Per frame. At thirty FPS that’s about 498 megapixels per second of fill rate.
Of course, videocards don’t measure pixel fill rate, they measure
texture fill rate, and when 3d rendering, nearly everything counts as a
texture. Lighting? Check. Shadows? Check. Textures, normalmaps, spec
maps, alpha maps... check check check. You get the idea. That 500
megapixels per second very quickly becomes 3-4 gigatexels. For a single
character.
But wait! You say. Modern video cards are much faster than the consoles!
BZZZZT. They are, but that doesn’t tell the whole story. Console video
chips have specific optimizations based on how developers tend to use
them. As such they can do things like transparencies and FSAA for free.
Or nearly so.
Oh you wanted some kind of AA on Sif? Well on PC that just doubled or
quadrupled your frame buffer. So now you are using somewhere between 6
and 16 gigatexels of fill rate.
Whine: 9 (I can’t fight Sif! The game slows to a crawl!)
Cost: 9
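The worst-case arithmetic above is easy to reproduce (the 8 fur layers and the samples-per-pixel figure are my assumptions, not measured numbers):

```python
# Reproduce the worst-case fill-rate estimate for a screen-filling,
# 8-layer transparent model at 30 FPS.

def fillrate_mpix_per_sec(w, h, layers=8, fps=30):
    # Pixels shaded per second, in megapixels.
    return w * h * layers * fps / 1e6

print(fillrate_mpix_per_sec(1024, 720))   # ~177 megapixels/sec
print(fillrate_mpix_per_sec(1920, 1080))  # ~498 megapixels/sec

# Every lighting, shadow, normal, spec and alpha lookup is another texture
# sample, so multiply by a samples-per-pixel estimate, then by the AA factor.
samples_per_pixel = 7  # assumption for illustration
for aa in (2, 4):      # 2x or 4x supersampling
    gtex = fillrate_mpix_per_sec(1920, 1080) * samples_per_pixel * aa / 1e3
    print(f"{aa}x AA: ~{gtex:.1f} gigatexels/sec")
```

Plug in your own layer and sample counts if you don't like mine; the shape of the curve (pixels x layers x samples x AA) is the point.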
5. Shader Languages.
This is where they take the biggest hit on the port, and where they have
likely focused most of their work. Because they have a 360 and PS3
version, they obviously have some kind of shader abstraction going on.
But the problem is, when you hit PC, different videocards support
different things when it comes to shader languages, and using the wrong
thing at the wrong time can take a 60fps game down to nothing. On 360
and PS3 this isn’t an issue but on PC? You bet it is. In fact, it’s
something you can’t ignore, despite the cost of testing, debugging, and
profiling on a ton of video cards. On a modern engine? ****, this has
been done for you (or mostly). But on the one they used? It’s only there
as a rough helping hand.
Christ, even when making simple PC games nowadays, you’ll find features
you take for granted that just don’t work for **** on common videocards.
Locking the framebuffer resolution may have allowed them to take
shortcuts for problem graphics chips.
Whine: 10
Cost: 7
6. Online Stuff.
I’ve already gone into this further up in the thread, but this is not a
trivial thing either. They made a choice which was great for them and
likely what enabled the project at all.
Whine: 9
Cost: 10 (non-GFWL), 2 (GFWL).
7. Animation Quality.
Animation can take up a lot of space, especially when you have multiple
skeletons (they have unique skeletons for everything in the game as far
as I can tell), and when there’s a lot of bones per skeleton (oh, there
is). One way people get around this is by using very high rates of
animation compression. Well, that’s what you do when you can’t use a
single skeleton (which is what a vast majority of games these days do).
What animation compression does is reduce the size of the animations in
memory, but it also introduces a jittery aspect to the motion. Ever seen
a character’s feet float around on the ground when they were standing
still? Animation compression.
Using a lower framebuffer can hide some of that jittering, which would otherwise look fairly terrible.
Whine: 4 (6 if you have crashing due to running out of memory from less compressed animations).
Cost: 2 (reduce animation compression), 7 (change animation compression
algorithm), 10 (try and change skeletons/reduce raw animation cost).
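As a toy model of that jitter (this is just keyframe quantization, far simpler than any real compressor, but the memory-vs-error trade-off is the same idea):

```python
# Toy model of animation compression: quantize a bone's position keyframes
# down to N bits and see how far the decoded values land from the truth.

def quantize(value, bits, lo=-1.0, hi=1.0):
    # Snap a float in [lo, hi] onto a (2**bits - 1)-step grid and decode it.
    steps = (1 << bits) - 1
    q = round((value - lo) / (hi - lo) * steps)
    return lo + q / steps * (hi - lo)

# A foot bone drifting very slightly over 8 frames of an idle animation.
track = [0.1000 + 0.0003 * i for i in range(8)]
for bits in (16, 8, 4):
    decoded = [quantize(v, bits) for v in track]
    worst = max(abs(d - v) for d, v in zip(decoded, track))
    print(f"{bits:2d}-bit track: worst per-frame error {worst:.5f}")
# The error varies from key to key, so at low bit depths the foot visibly
# wobbles around its true position -- the "floating feet" effect.
```

Halve the bits and you halve the memory, but the per-frame error grows, and error that changes frame to frame is exactly what reads as jitter on screen.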
8. Timing Calculations.
For those of you who don’t know how to make games, every frame the game
takes a rough count of how much time has passed since the last frame,
and calculates a new game state. That’s moving things, rendering things,
animating things, etc.
The problem with Delta Time, or DT as we call it, is that if you are
working such that you always have a known or high DT (high meaning a
lower framerate), there’s a *** of code bugs that will never get seen.
From particles that don’t work (It normally looks like fire! But now it
looks like a laser beam into the sky!), to physics that freak out (When I
kill that enemy he stretches to infinity!), to things that to the
layman simply don’t make sense at all (My attacks don’t hit anymore! I
fall through the world! The enemy only ever turns left!).
Finding and ironing out all these issues after the fact? It’s close to
impossible. Especially when some of those issues may have to do with
fundamental architecture assumptions.
Whine: 8
Cost: 10
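A classic example of the kind of bug that hides behind a locked DT (a made-up snippet, but the pattern is real): per-frame constants that were tuned at one framerate silently change behaviour at another.

```python
# Per-frame damping tuned for a locked 30 fps update. The 0.9 multiplier is
# applied once per *frame*, not per second, so the result silently depends
# on how often the update runs.

def simulate(fps, seconds=1.0, damping_per_frame=0.9, v0=10.0):
    v = v0
    for _ in range(int(fps * seconds)):
        v *= damping_per_frame  # framerate-dependent bug: should scale with dt
    return v

print(simulate(30))  # the feel the designers tuned for
print(simulate(60))  # same game second, object stops far sooner
```

Run the same one game-second at 60 fps and the damping is applied twice as often, so the object is an order of magnitude slower. Nobody sees this while the framerate is locked, which is exactly why these bugs surface only after the fact.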
9. Single threaded game update.
Given the PS3 only has one general purpose CPU core, it’s not irrational to
think they may have a single threaded game update. Depending on choices
they made, that same game update may have to wait for the frame render
to complete in between updates. If this is the case, then, given the
fact that we already know their AI eats up a *** of CPU, it’s likely
that in this case they have to keep the render costs extremely low in
order to have a playable framerate at all.
The reason I think this may be the case is that traditionally Japanese
developers have worked this way in order to target their games for a
locked 60 frames per second. But they are also used to building games
with very little update logic (AI and such), so they could traditionally
keep CPU costs for things other than rendering low.
But I’ve seen how poorly modern games can perform in these scenarios, so
if they did build it this way, they’d have little choice in these
matters.
Whine: 8 (poor framerates)
Cost: 10
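The serialized loop described above looks roughly like this (a schematic, not their engine): when update and render share one thread, frame time is their sum, so the only lever left is keeping the render slice small.

```python
# Schematic single-threaded loop: the update can't start again until the
# render finishes, so frame time = update_ms + render_ms.

def achievable_fps(update_ms, render_ms):
    frame_ms = update_ms + render_ms  # fully serialized, no overlap
    return 1000.0 / frame_ms

# A heavy AI update leaves very little room for rendering at 30 fps:
print(achievable_fps(update_ms=20, render_ms=10))  # ~33 fps: barely fits
print(achievable_fps(update_ms=20, render_ms=30))  # 20 fps: render too costly
```

With 20 ms of update cost fixed by the AI, the render budget at 30 fps is about 13 ms, which is why a cheap, low-resolution framebuffer starts to look less like laziness and more like the only option.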