Building a new PC, part 2
Apr. 6th, 2008 05:36 pm
Working out what I need for the new machine
Graphics card(s)
The component that makes the most difference to performance in modern 3D games is the graphics card. The key measure of graphics speed is 'fps' - frames per second - which is simply the number of times the picture updates each second. The internet is full of benchmarks comparing one graphics card to another. The higher the resolution you want to play a game at, the lower the fps will typically be. Modern TFT LCD monitors prefer to be used at their native resolution, otherwise the picture can look a little blurry (since a physical element of the screen ends up displaying, say, 1.5 pixels). Since my monitor has a native resolution of 1920x1200, the benchmarks I am interested in are the ones that show performance in modern games at 1920x1200, with most of the detail settings turned up to high or very high.
There are really only two companies in this market - ATi (now owned by AMD) and NVidia. For the last few years the two have been locked in an arms race to produce graphics cards with a) the best performance and b) the best 'bang per buck'. However, for most of the last eighteen months ATi's high end products just haven't been very good, and as a result NVidia hasn't needed to release new high end products - it already had the fastest cards in the shape of the GeForce 8800GTX, 8800Ultra and 8800GTS.
This has been annoying, because even the fastest cards were still not fast enough to cope with the most demanding games at high resolutions. One game in particular - Crysis - is so demanding that even the top of the range 8800Ultra could only manage 29fps at 1920x1200. The human eye is usually reckoned to be able to distinguish only around 25fps, so you would think that would be plenty, but the 29fps figure is an average. When there is more going on on screen - other characters, explosions and so on - the fps drops significantly, and the game becomes unplayably jerky.
The situation has changed in the last couple of weeks though, which is why I have finally decided to upgrade (I had previously planned an upgrade in January, sticking to my usual two year upgrade cycle, but for once it seemed worth waiting).
So what's changed?
That's for part 3...