
Selecting the Chipset That's Best

If you've decided that it's time to replace your wheezing graphics card or comatose onboard graphics with a blazing AGP card, you have no shortage of solutions. Even though the number of high-performance graphics chip makers is smaller today than it was a few years ago, there are more cards from which to choose. Here's why:

  • nVidia, as it has from the beginning, is strictly a graphics chipset maker. Any graphics card with an nVidia chipset onboard was produced and is supported by a third-party company.

  • Matrox, as it has from the beginning, is strictly a graphics card maker that uses its own chipsets. Every graphics card with a Matrox chipset is built by Matrox.

  • Since the late 1980s, ATI had traditionally followed the Matrox "only we build cards from our chips" model. Starting in late 2001, however, ATI began offering some, and later virtually all, of its chips to third-party video card makers. Starting in late 2002, ATI began taking significant market share from nVidia in the chipset business, while continuing to make ATI-brand video cards as well.

As a result of ATI's joining the chipset business while maintaining its established graphics card business, you have about as many choices using ATI chipsets as you do with nVidia chipsets—especially if you shop at "white-box" component stores or online stores.

From the standpoint of performance, the race is between ATI and nVidia as of this writing. Matrox cards have excellent 2D image quality, but their 3D gaming performance is very poor.

It's impossible in the space available in this book to give you a comprehensive rundown of the nVidia and ATI chipsets on the market. Besides, if history is any guide, both nVidia and ATI will have introduced several new chipsets by the time you read this. Thus, the next section provides you with a checklist of significant features to look for along with examples of nVidia and ATI chipsets with these features.

ON THE WEB: Try Them? We Already Did!

Wondering how TechTV rates the current chipsets? Just ask! Type in the name of a recent chipset at the TechTV web site's search engine (http://cgi.techtv.com/search) to see articles and reviews that pit today's hottest 3D chips against each other. Type "video card" (don't forget to use the quotes) for hundreds of articles. We've got you covered with lots of information!

Key Features to Look For

The features you need in a graphics card are determined by how you use your computer and the amount of money you are willing to spend. From the standpoint of graphics cards, the major types of users fall into these categories. I bet at least one of these capsule descriptions describes you.

  • High-end gamer—You need maximum 3D performance and hang the cost.

  • Mid-range gamer—Your budget demands a balance between 3D performance and reasonable costs.

  • Casual gamer—You don't live and die by computer games, but (without spending a lot of money) you need better 3D performance than the low-powered integrated graphics built into many computers.

  • Video producer—You're far more interested in grabbing and editing video than in game playing.

  • Code or text jockey—As long as you can see two displays at the same time, you're happy.

How do these shorthand descriptions translate into features? Table 13.2 sorts it out for you. See the following sections for more information.

Table 13.2 Key Graphics Card Features by User Type

(HE = High-End Gamer; MR = Mid-Range Gamer; CG = Casual Gamer; VP = Video Producer; CJ = Code/Text Jockey)

Feature                                     HE   MR   CG   VP   CJ   Example Chipset(s)
DirectX 9.0 support                         Yes  Yes  -    -    -    nVidia GeForce FX series; ATI Radeon 9800 Pro, 9600 Pro
DirectX 8.0/8.1 support                     Yes  Yes  Yes  -    -    nVidia GeForce4 Ti 4xxx; ATI Radeon 9000, 9200
64MB of RAM                                 -    -    Yes  Yes  Yes  nVidia GeForce4 MX; ATI Radeon 9000
128MB or more of RAM                        Yes  Yes  -    -    -    nVidia GeForce FX, GeForce4 Ti; ATI Radeon 9000 Pro, 9500, 9800 Pro
Memory bus and memory speed                 Yes  Yes  -    -    -    For the fastest memory bus and memory speed, look for high-end cards
Chipset clock speed and internal pipelines  Yes  Yes  -    -    -    Low-end and mid-range cards use slower chipsets with fewer pipelines
TV-out                                      Yes  Yes  Yes  Yes  -    Most current nVidia and ATI chipsets
Dual CRT or CRT/LCD display                 Yes  Yes  -    Yes  Yes  nVidia GeForce FX, GeForce4 Ti, GeForce4 MX; ATI Radeon 9xxx Pro series
TV/video capture                            -    -    -    Yes  -    ATI All-in-Wonder series; nVidia Personal Cinema series
Enhanced DVD playback                       Yes  Yes  Yes  Yes  -    Most current nVidia and ATI chipsets


DirectX Support—Smarter Shading at a Price

If you're buying a new graphics card to improve your gameplay, you want to know how powerful the 3D effects features built into the chipset are. A card with high-quality 3D effects can create almost photographic renderings, while a card with a less sophisticated range of 3D effects creates images that are less compelling.

The major 3D chipset features that determine 3D image quality include hardware transform and lighting (T&L), vertex shaders, and pixel shaders.

A quick way to separate the best chipsets from the rest is to find out which version of Microsoft DirectX they support. Microsoft DirectX is the driver technology Microsoft uses to run 3D games in Windows. To improve 3D visual quality, Microsoft has continually improved the built-in 3D rendering features included in each new version of DirectX, and video card makers have worked with Microsoft to incorporate those improvements into their hardware.

NOTE

Hardware transform and lighting (T&L)—The process of converting a 3D object into polygons (which are made of triangles) and placing the correct lighting effects across the surface of the polygons.

Vertex—The corner of a triangle in a 3D scene.

Vertex shader—Creates texture and other effects on the vertices in a 3D scene.

Pixel shader—Creates textures and other effects on the pixels making up a 3D scene.

Table 13.3 compares the 3D features of recent DirectX versions.

Table 13.3 3D Rendering Features by DirectX Version

Feature                                                            7.0  8.0  8.1  9.0
Hardware transform and lighting                                    Yes  Yes  Yes  Yes
Programmable vertex and pixel shaders                              No   Yes  Yes  Yes
Enhanced pixel shader (more instructions, easier-to-use language)  No   No   Yes  Yes
Floating-point pixel shader for more precision                     No   No   No   Yes
Enhanced vertex shader (more instructions, flow control features)  No   No   No   Yes


As Table 13.3 shows, graphics chipsets that support DirectX 9 offer the most sophisticated 3D rendering effects available with any DirectX version.

Most 3D games on the market today support DirectX 8.0, but as time goes on, DirectX 9 support will become widespread. If you're even a casual gamer, I recommend you look for video cards whose chipsets support at least DirectX 8. DirectX 9 support was once confined to the higher levels of the nVidia and ATI lines, but can now be found in cards under $150.

Graphics Card RAM—More is Better for Resolution and Performance

The amount of RAM on the graphics card has a big impact on two factors important for game playing:

  • Maximum screen resolution at a given color setting

  • Performance

Today's graphics cards pack amounts of memory (64MB or more) that not long ago were regarded as more than sufficient for system memory. 3D graphics cards require much more memory than the 2D graphics cards used a few years ago because the memory on the card is used for many different kinds of data, according to The Tech Report's Geoff Gasior (http://www.tech-report.com):

  • Frame buffer—The bitmap of what is seen on the screen at any given moment. You can calculate the frame buffer size by multiplying the horizontal pixels by the vertical pixels by the color depth (in bits) and dividing the result by eight (the number of bits per byte). For example, to display 1,280 by 1,024 resolution at 24-bit color depth, you need almost 4MB of RAM on the video card just for the frame buffer. Here's the calculation: 1280x1024x24=31,457,280 bits; 31,457,280/8=3,932,160 bytes.

Both 2D and 3D cards use frame buffers, but the following memory-use factors are exclusive to 3D cards:

  • Back buffer—The buffer in which the next frame is drawn before it is displayed; it requires the same amount of RAM as the frame buffer.

  • Z-buffer—Stores depth information for each pixel; its bit depth can equal the color depth or be smaller. Most graphics cards compress the Z-buffer to save RAM.

  • 3D programs—The programmable vertex and pixel shader programs used by DirectX 8.0, 8.1, and 9.0-compatible graphics cards use memory; DirectX 8.1 and DirectX 9 programs are longer and contain more variables than DirectX 8 programs, so they need more memory.

  • Geometry data—The greater the number of polygons in a scene, the greater the amount of RAM needed to display them.

  • Textures—The amount of RAM used to store texture data varies widely. Although AGP supports the storage of texture data in main memory, using AGP card memory is much faster.
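To see how these memory uses add up, here's a back-of-the-envelope sketch in Python. The function names are my own, and the assumption that the z-buffer matches the color depth is illustrative; as the list above notes, textures, geometry data, and shader programs come on top of these fixed buffers and vary widely per game, so they're excluded here.

```python
def framebuffer_bytes(width, height, bits_per_pixel):
    """Size of the on-screen bitmap: pixels x color depth (bits), converted to bytes."""
    return width * height * bits_per_pixel // 8

def estimate_3d_vram(width, height, color_bits, z_bits):
    """Rough minimum RAM for the three fixed buffers a 3D card keeps on-board.

    Illustrative only: real cards often compress the z-buffer, and textures,
    geometry, and shader programs add a variable amount on top of this.
    """
    front = framebuffer_bytes(width, height, color_bits)  # frame buffer
    back = front                                          # back buffer: same size as frame buffer
    z = framebuffer_bytes(width, height, z_bits)          # z-buffer (uncompressed here)
    return front + back + z

# The book's example: 1,280x1,024 at 24-bit color needs almost 4MB just for the frame buffer
print(framebuffer_bytes(1280, 1024, 24))  # prints 3932160
```

Run with a 24-bit z-buffer at the same resolution and the three fixed buffers alone already consume almost 12MB, which helps explain why 64MB is now the practical floor for graphics card RAM.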

ON THE WEB: Deep Graphics Require Lots of RAM

To discover how much RAM your graphics card needs to display the color depth, resolution, and z-buffer depth you desire, see the "Memory requirements for 3D applications" page of the 3D WinBench web site at http://www.etestinglabs.com/benchmarks/3dwinbench/d5memfor3d.asp.

Low-end cards typically have 64MB of RAM, and mid-range and high-end graphics cards feature 128MB of RAM. Although older games such as Quake III show little effect with different amounts of RAM, newer DirectX 8- and DirectX 9-compatible games benefit from 128MB of RAM or more.

ON THE WEB: 64MB Versus 128MB at The Tech Report

Hats off to The Tech Report's Geoff Gasior for discovering the impact of graphics card memory sizing on performance. See his report at http://www.tech-report.com/etc/2002q3/graphicsmem/index.x?pg=1.

Other Chipset Design Factors

The internal design of the chipset determines what level of 3D graphics it can produce. However, a graphics chip vendor can create several different price levels for chips and cards by adjusting the following:

  • Number of vertex and pixel shader pipelines in the chip core—Low-end cards have fewer pipelines than mid-range or high-end cards; more is faster.

  • Memory bus width—Low-end cards might have a 64-bit or 128-bit memory bus, while mid-range and high-end cards usually have a 256-bit memory bus; wider is faster.

  • Memory bus speed—Measured in MHz, faster is better. Some vendors make comparisons more difficult by building different models with smaller amounts of faster RAM and larger amounts of slower RAM.

  • Memory chip type—Square ball grid array (BGA) memory chips have smaller connectors and run faster, but are more expensive than the rectangular memory chips used on low-end cards.

  • Graphics chip core speed—Measured in MHz; faster is better.

  • Anti-aliasing support—Anti-aliasing smoothes the edges of on-screen objects, but it can slow down graphics display. More advanced cards offer higher-performance anti-aliasing, and low-end cards might use simpler and slower anti-aliasing technologies.
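The combined effect of memory bus width and bus speed can be captured in a single number: peak memory bandwidth, which is simply the bus width in bytes times the effective clock rate. A small sketch (the clock speed used below is hypothetical, and the doubling factor models the DDR memory used on most of these cards):

```python
def memory_bandwidth_gbps(bus_width_bits, clock_mhz, transfers_per_clock=2):
    """Peak memory bandwidth in GB/s.

    transfers_per_clock=2 models DDR memory, which moves data on both
    clock edges; set it to 1 for plain SDR memory.
    """
    bytes_per_transfer = bus_width_bits // 8
    return bytes_per_transfer * clock_mhz * 1_000_000 * transfers_per_clock / 1e9

# Hypothetical comparison at the same 300MHz DDR memory clock:
print(memory_bandwidth_gbps(128, 300))  # 9.6  (GB/s, low-end 128-bit bus)
print(memory_bandwidth_gbps(256, 300))  # 19.2 (GB/s, high-end 256-bit bus)
```

This is why a wider bus matters so much: doubling the bus width doubles peak bandwidth without requiring faster (and more expensive) memory chips.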

TV-Out and Dual Displays

Although multiple display support was once a rarity, all but the cheapest video cards now offer some type of it. However, make sure you understand which types of displays a particular card supports. In addition to the 15-pin VGA jack, today's graphics cards frequently feature one or both of the following:

  • TV-out—The TV-out port has become a staple on almost all current graphics cards. It enables you to send your video display out to a TV or VCR's S-video or composite connector. Note that many low-cost graphics cards with dual-display support provide only VGA and TV-out ports.

  • DVI—The Digital Visual Interface (DVI) supports digital LCD panels and is found on most recent mid-range and high-end graphics cards. However, because many LCD panels connect to the same 15-pin VGA port used by CRT monitors, graphics cards that have a DVI port are usually supplied with a DVI-to-VGA adapter so that dual displays can be supported even if both monitors use the VGA port.

Figure 13.4 shows the VGA, TV-out S-video, and DVI ports on the rear of a typical mid-range graphics card, along with the corresponding cable connectors.

Figure 13.4 VGA, S-video TV-out, and DVI ports with matching cable connectors (inset).

Cards with TV-out or DVI ports have special driver software to enable you to select which display(s) are active.

Video Capture and Playback

Whether you want to capture video for editing and DVD creation or simply want to turn your computer into a digital video recorder so that you can watch your favorite TV shows whenever you want to, you can upgrade to specialized graphics cards that include TV and video input jacks and video capture software.

ATI makes the All-in-Wonder series (similar cards are available from some of ATI's manufacturing partners), and nVidia's manufacturing partners make the Personal Cinema series. ATI's All-in-Wonder cards are available in entry-level, mid-range, and high-end categories. The All-in-Wonder 9700 Pro shown in Figure 13.5 uses the same high-performance graphics chip used in the ATI Radeon 9700 Pro. The 9700 Pro has been replaced by an improved version known as the ATI Radeon 9800 Pro.

nVidia's Personal Cinema series is based on nVidia's entry-level graphics chipsets: the latest models use the nVidia GeForce 4 MX chips.

Unlike ATI, which removes the dual-monitor display feature to make room for the connectors shown in Figure 13.5, nVidia's Personal Cinema uses an external breakout box to support audio/video input-output, as shown in Figure 13.6.

Figure 13.5 ATI's All-in-Wonder 9700 Pro combines high-performance 3D graphics with a full range of video input-output, capture, and editing features. Photo courtesy of ATI Technologies.

The ATI and nVidia products include software for video capture and editing, as well as TV recording and program scheduling. Consult the ATI and nVidia web sites for details.

Figure 13.6 The nVidia Personal Cinema Digital Media Hub connects all types of home entertainment devices to the nVidia GeForce 4 MX graphics card at the heart of the nVidia Personal Cinema system.

ON THE WEB: Video Capture and Recording Your Way

Get more information on ATI's latest All-in-Wonder cards from the ATI web site at http://www.ati.com.

Discover the latest nVidia Personal Cinema features and manufacturing partners at the nVidia web site at http://www.nvidia.com.

Getting the Information You Need to Make an Informed Choice

With so many options available for your graphics card upgrade, it can be difficult to make a decision. Because these design factors have complex effects on card performance, the best way to determine which cards deliver the performance you need is to review trusted benchmarking sources, starting with TechTV's own reviews.
