So you only have PCI slots and want to game?

Hello, folks. I've just finished reading this entire thread (3 years worth!), and it's been very helpful. Thanks to everyone who posts helpful info/comments to it.

My situation is a bit different from most. I have an AGP 4x slot, but it's occupied by my Wildcat Realizm 200 CAD card, which I need for other purposes. I recently decided to game a bit, which requires me to install a gaming-suitable PCI card. Fortunately, my mobo allows me to select PCI or AGP graphics at startup. Here's what I've gleaned from this thread:

The best options right now are: Radeon HD 2400, X1550, and X1300; Nvidia 8400 GS and 8500 GT. (Other cards announced but not yet available in the US include the Nvidia 9400 GT and 9500 GT.)
All these cards come with 256 MB of memory, except the 8400, which can have 512 MB. The favorite seems to be the 8500. However, there's a price penalty: as someone here noted, you can get a new 512 MB 8400 for $30 (microcenter.com: $40 with a $10 rebate, but hurry, it ends 12/28), while an 8500 will cost $100.

Two questions I've not seen definitively answered:

1. Does the 8400's advantage of twice the onboard memory (512 MB vs. 256 MB) make up for its otherwise lower specs compared to the 8500? Or are the two close enough to justify the $70 savings?

2. Does the data-transfer bottleneck of PCI negate the advantage of a 128-bit memory bus over a 64-bit one? I've seen this question answered both ways: that PCI restricts data transfer so much that the memory bus width doesn't really matter...or (the other side) that of COURSE a bus twice as wide means better performance. So, which is it?

I'd like to buy the 8500, but frankly I wonder if it's worth $70 more. I suspect I'd be very happy with the 8400 or the X1550 (I can get the latter for about $40).

(My computer: MSI 694T; full tower; P3 Tualatin overclocked to 1.6 GHz; 1.5 GB SDRAM; 400 W PSU; LOTS of cooling. This system is WAY faster than the P4 2.2 GHz (or so) setup I use at work.)
 
I hate correcting people. Now look at this: I just finished playing Crysis for an hour and 30 minutes, and my temps were at 42 C.

Also, look at this: 33 fps, plus 11+ fps, in both photos. It does drop quickly to around 9-10 after the bombing, but quickly shoots back up to 20, 30, now 33 lol. I did put physics quality on low now, and shadows are still on medium, but everything else is on low. I did not OC my card, but when I do, the game moves about 3-5% faster. I told y'all, it depends on where I am in the game, because some parts really drop (only because I have the settings on medium). The lowest it drops is 8 fps, one time 4 fps lol because I blew up 2 trucks at once lol. But other than that, Crysis runs OK for my rig.

http://c83.cc/images/9qal9kck99o67vzi16ip.jpg

http://c83.cc/images/op3xnk0gjv1udqv1x0f.jpg

http://c83.cc/images/z0026fh3jq3oyoo0bi.jpg

Let me show you some pictures, then I will explain why your results make no sense, after which I will accuse you of blatant lying and photoshopping of images. Deal? Here we go.

Here we see a YouTube video of someone getting something like 8-17 FPS with a Powercolor 2400 Pro PCI: http://www.youtube.com/watch?v=gwgt-CWO5YE - Granted, he is using FRAPS, but he is also using a P4 with HT at 3.06 GHz, rather than a PIII at 600 MHz. He is running mostly low settings, with CPU options on high and two GPU settings on high - but he is doing it at 800x600.

Now for some of my own. These are of Crysis Warhead (my Crysis DVD isn't with me here) at 1280x800 on my Dell XPS M1530. All settings are at Mainstream (Medium). 1280x800 puts 30% more pixels on screen than 1024x768. The results speak for themselves:
http://i36.photobucket.com/albums/e44/Direwolf007/ScreenShot0000.jpg
http://i36.photobucket.com/albums/e44/Direwolf007/ScreenShot0001.jpg
http://i36.photobucket.com/albums/e44/Direwolf007/ScreenShot0002.jpg
http://i36.photobucket.com/albums/e44/Direwolf007/ScreenShot0003.jpg

We notice a few things. First of all, performance ain't all that hot. Second of all, from what I gather the output format for the FPS measurement is:
<average> (<low>...<high>), where low and high are kept for a small history (I do not know how long precisely) and are rounded to the nearest whole number.

Let's remember, the CPU in this laptop is a Core 2 Duo T7500 (a Centrino platform chip) running two cores at 2.2 GHz. It is outfitted with 4 GB of DDR2 RAM and uses a GeForce 8600M GT with 256 MB of GDDR3, a 128-bit memory interface, and a PCI-E x16 interconnect.

Tha General is running a Pentium III at 600 MHz. Even assuming performance scales linearly with clock speed, the T7500 is almost four times faster on a single core. Of course, the performance difference between the P3 architecture and the modern Core 2s isn't just clock speed; the T7500 is more than ten times more powerful than the lowly PIII. The 8600GT DDR3 is more than three times more powerful than the HD2400 Pro, and the mobile version is identical to a regular 8600GT, just clocked slightly lower. I will give the HD2400 Pro the benefit of the doubt and say the 8600M GT is only twice as fast. Finally, the PCI bus bottlenecks even the HD2400 Pro (compare 3DMark results between the PCI-E and PCI versions of cards, since those are easy to google for; the PCI-E versions are up to twice as fast, due to bus limitations), but we will assume the PCI bus does not bottleneck the card. This is a lot of leeway given to the PIII + HD2400 Pro combo.

Let us sum it up: we display 30% more pixels and get around 18 fps average on medium settings; normalized, this gives us around 23-24 fps average at 1024x768. Since this is with a CPU at least four times more powerful and a GPU at least twice as powerful, it is quite safe to assume a reasonable FPS for the PIII setup is at most half of that. This is also well in line with the results I achieved with the PCI 8500GT on the VIA C7-D (in my sig), which is also a setup superior to the PIII.
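If you want the arithmetic spelled out, here is a rough sketch of that normalization in Python (the scaling factors are the conservative assumptions stated above, not measurements):

[code]
# Back-of-the-envelope version of the normalization argument above.
# The scaling factors are the conservative assumptions from this post,
# not measured values.

measured_avg_fps = 18.0          # observed on the T7500 + 8600M GT, 1280x800, medium
pixels_high = 1280 * 800         # 1,024,000 pixels
pixels_low  = 1024 * 768         #   786,432 pixels (~30% fewer)

# Assume fps scales inversely with pixel count (fill-rate limited):
normalized_fps = measured_avg_fps * pixels_high / pixels_low
print(f"Equivalent at 1024x768: ~{normalized_fps:.1f} fps")      # ~23.4 fps

# The claim above: a rig with a CPU at least 4x weaker and a GPU at least
# 2x weaker should manage at most half of that.
piii_ceiling = normalized_fps / 2.0
print(f"Generous ceiling for the PIII + HD2400 Pro: ~{piii_ceiling:.1f} fps")  # ~11.7 fps
[/code]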

In other words, against two different systems (and a third coming up), Tha General's claimed results make no sense. Compared with the independent youtube video, they still make no sense.

Now, to the second point. Like I said, from what I found on the web, Crysis reports FPS in the following manner:
<average> (<recent minimum> .. <recent maximum>)

This means the average cannot be more than 0.5 fps higher than the recent maximum, so Tha General's image, where his system is reportedly running at "FPS 33.5 (1 .. 4)", makes absolutely zero sense. And this is the part where I accuse you, Tha General, of editing your reported images. If you look carefully at the number of reported Tris in the same image (below is a link to a saved copy, in case he changes his), the digit 5 is cut in the middle. Add this to my previous point, and it is quite clear that the image has been modified.
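For anyone wondering why the average can't outrun the recent maximum, here is a minimal Python sketch of a rolling FPS readout in that format, assuming - as the argument above does - that the average and the recent min/max are computed over the same short window of samples:

[code]
from collections import deque

class FpsCounter:
    """Produces a readout of the form: FPS <avg> (<min> .. <max>)."""

    def __init__(self, window=60):
        # Keep only the most recent `window` per-frame fps samples.
        self.samples = deque(maxlen=window)

    def add_frame(self, frame_time_s):
        self.samples.append(1.0 / frame_time_s)

    def readout(self):
        if not self.samples:
            return "FPS --"
        avg = sum(self.samples) / len(self.samples)
        lo, hi = round(min(self.samples)), round(max(self.samples))
        # avg is taken over the same samples as hi, so avg <= max(samples),
        # and rounding hi can hide at most ~0.5 fps of that - which is why
        # "33.5 (1 .. 4)" cannot come out of a counter built this way.
        return f"FPS {avg:.1f} ({lo} .. {hi})"
[/code]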

Also, Tha General, you reported that you ran the game at 1024x768, but your linked images are 1024x640. Which is correct?

Saved image:
http://i36.photobucket.com/albums/e44/Direwolf007/9qal9kck99o67vzi16ip.jpg
 
So I guess you think I am also lying when I say I am using a 90-watt PSU that has been working since 2000, right? HAHA

You are clearly missing the point. I do not get over 30, or even 28, all the time; this is where the confusion is coming in. I get between 13-20 fps, sometimes 22, at medium settings at 1024x768. The 28-35 only shows up in certain areas of the game where there isn't a lot of struggle going on. In the video I saw, he is using a Pentium 4 with more than 2 GB of RAM and wants to play at 800x600 on low lol. He can get higher than that lol. If I can play at 1024x768, he should be able to play at 1280x1024 low to medium, using the 8.12 drivers. Hell, I am not even using the 8.12; I have to use the 8.3 VisionTek drivers.

I didn't alter any numbers from DisplayInfo. The numbers appear incorrect or whatever because I use FRAPS to take snaps, and sometimes when I hit the snap button (Scroll Lock) it doesn't take, and the numbers jump around quickly because of all the action on screen sometimes. No big deal. And about 1024x768: I resize the photos to 1024x640 because it looks better. :)

If I were using an 8500GT, my performance would be much better. Even with the 8400GS, I ran at 1280x1024 with everything on low and still got 20-30. I can't show you this now, because I am not using XP and I don't have the PNY 8400GS anymore (only because the card is too big for my small tower). But I am buying the BFG 8400GS 512MB version soon for this rig.

I am kinda done with this discussion, but believe what you want. As I said, at 1024x768 low to medium I get around 10-22 fps, mostly 13 or 14 up to 22, as you've seen in the photos. So that's pretty much playable for my rig, which is, I remind you:

Prebuilt Gateway, Intel Pentium III 600 MHz with 800 memory bandwidth, 512 MB RAM (Dell & Gateway sticks), 90-watt PSU, VisionTek 2400HD 256MB PCI - which, btw, Crysis wouldn't even play at those levels without the awesome 2400HD. Oh, and did I mention that I am using a 10 GB internal HD which has been in my computer since 2000? I also have a 500 GB and a 250 GB external HD hooked up to my USB 2.0 PCI card, and I have a Sound Blaster AE 2 PCI card, all of it working wonderfully with my 90-watt PSU.

So um, believe what you want, but I game just fine with PCI cards and I will keep using them, so I am glad that folks are still making them.

Here are some shots, the last shots I am posting.

1024x768 low to medium, NO AA
http://c83.cc/images/jevw5klzfd9nzl4pok3m.jpg
http://c83.cc/images/tc067jzs8nkck1aljrc.jpg
http://c83.cc/images/bau7t7lqat6zmti3ugu.jpg

1024x768 low to medium with 2x AA
http://c83.cc/images/pvkuiff9fyim05qvduqq.jpg
6.8 fps when there are massive explosions on screen. Now that is pretty good, because it only drops from 15 fps to 6.8 even when explosions happen.


Just a couple of benchmarks with other games.
COD4 doesn't seem to work well at all with my 2400HD, but works just fine with the 6200.
Jericho: very smooth at 1024x768 with shaders and textures on high, 13-20 fps.
TimeShift: I have not tested it with the 2400HD under Win2k, but on XP I used to get 20-50 fps.

Anyways, take care, peace.
 
Oh, and btw, BioShock images on my Pentium III: it runs between 8-15 fps. Not bad lol. A couple of things to note though: I have to turn down hardware acceleration on my Sound Blaster card, because if I don't, the sound starts to crackle in the game. Also, I have the settings at 1024x768 with actor and texture detail on highest, real-time reflections off, post-processing on, and detail shaders on high.

http://xs134.xs.to/xs134/08515/bioshockpentium3476.jpg
http://xs134.xs.to/xs134/08515/bioshokcc5538.jpg

You can't save in the demo, so I will not be playing this anymore until I buy it. It's only 20 bucks, so I will grab it soon. Anyways, I guess that's enough of me showing you that PCI cards are great for gaming. I would prove to you that Test Drive Unlimited runs on my rig at 1280x1024 medium-to-high settings plus 2x AA, but why bother lol.
 
Hello, folks. I've just finished reading this entire thread (3 years worth!), and it's been very helpful. Thanks to everyone who posts helpful info/comments to it.

My situation is a bit different from most. I have an AGP 4x slot, but it's occupied by my Wildcat Realizm 200 CAD card, which I need for other purposes. I recently decided to game a bit, which requires me to install a gaming-suitable PCI card. Fortunately, my mobo allows me to select PCI or AGP graphics at startup. Here's what I've gleaned from this thread:

The best options right now are: Radeon HD 2400, X1550, and X1300; Nvidia 8400 GS and 8500 GT. (Other cards announced but not yet available in the US include the Nvidia 9400 GT and 9500 GT.)
All these cards come with 256 MB of memory, except the 8400, which can have 512 MB. The favorite seems to be the 8500. However, there's a price penalty: as someone here noted, you can get a new 512 MB 8400 for $30 (microcenter.com: $40 with a $10 rebate, but hurry, it ends 12/28), while an 8500 will cost $100.

Two questions I've not seen definitively answered:

1. Does the 8400's advantage of twice the onboard memory (512 MB vs. 256 MB) make up for its otherwise lower specs compared to the 8500? Or are the two close enough to justify the $70 savings?

2. Does the data-transfer bottleneck of PCI negate the advantage of a 128-bit memory bus over a 64-bit one? I've seen this question answered both ways: that PCI restricts data transfer so much that the memory bus width doesn't really matter...or (the other side) that of COURSE a bus twice as wide means better performance. So, which is it?

I'd like to buy the 8500, but frankly I wonder if it's worth $70 more. I suspect I'd be very happy with the 8400 or the X1550 (I can get the latter for about $40).

(My computer: MSI 694T; full tower; P3 Tualatin overclocked to 1.6 GHz; 1.5 GB SDRAM; 400 W PSU; LOTS of cooling. This system is WAY faster than the P4 2.2 GHz (or so) setup I use at work.)

If I were you, I'd get the 8400GS or the X1550. You have no use for DX10 anyway, and the price of the X1550 is right for you at $40. As for your questions:

1) No. Neither the 8400GS nor the 8500GT ever needs more than 256MB of video RAM, so the 512MB on the 8400GS does not make up for the card's far weaker specs - it is a marketing trick.

2) Not directly. The 128-bit bus is the memory interface, that is, the interface between the GPU and its onboard memory. The card's interface to the host computer has no direct impact on it. The indirect impact is that most cards on PCI tend to be around half the performance of their AGP/PCI-E brethren (due to the bus limitations), so the differences between cards shrink, and the gap between 64-bit and 128-bit cards shrinks accordingly. A 128-bit card is still faster than a 64-bit one on PCI, just not by as much.
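To put rough numbers on that distinction - the memory clock below is a typical figure for low-end DDR2 cards of this era, not the exact spec of any particular model:

[code]
# Rough comparison: on-card memory bandwidth vs. the host PCI bus.
# 400 MHz DDR2 (800 MT/s effective) is an assumed, typical value here.

def memory_bandwidth_gbs(bus_width_bits, effective_mts):
    """Peak GPU<->VRAM bandwidth in GB/s."""
    return bus_width_bits / 8 * effective_mts * 1e6 / 1e9

print(f"64-bit card : {memory_bandwidth_gbs(64, 800):.1f} GB/s")    # ~6.4 GB/s
print(f"128-bit card: {memory_bandwidth_gbs(128, 800):.1f} GB/s")   # ~12.8 GB/s

# The host interface is a separate, far narrower pipe:
pci_gbs = 32 / 8 * 33e6 / 1e9        # classic 32-bit / 33 MHz PCI
print(f"PCI bus     : {pci_gbs:.2f} GB/s, shared with every other PCI device")
[/code]

So textures and geometry trickle in over roughly 0.13 GB/s no matter how wide the card's own memory bus is, which is why the 64-bit vs. 128-bit gap narrows on PCI but doesn't vanish.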
 
Here is my Crysis Benchmarking.

My PC Specs are:

Dimension 2350
Windows XP SP2
P4 2.8 GHz Northwood CPU
1 GB RAM
PNY Nvidia 8400GS 512MB PCI - stock clocks, no overclocking
Onboard Sound
Thermaltake 430 W PSU

Crysis Demo
800x600 - All Low Settings

I am getting 8-20 fps on average. Very similar to the Powercolor video posted above. If I look at the ground I get more than 30 fps. Amazing!

If I switch to 1024x768 I lose 2 fps. If I turn shaders to high I lose 4 fps.

If I were using an 8500GT, my performance would be much better. Even with the 8400GS, I ran at 1280x1024 with everything on low and still got 20-30. I can't show you this now, because I am not using XP and I don't have the PNY 8400GS anymore (only because the card is too big for my small tower). But I am buying the BFG 8400GS 512MB version soon for this rig.

This is incorrect. At 1280x1024 with everything on low I average 5 fps. The game is not playable at this resolution.

800x600 on low, or possibly 1024x768, are the only playable resolutions/modes. And that's on a P4 2.8 GHz CPU!

Some Screenshots:

800x600 Everything on Low. No AA



1024x768 Everything on Low. No AA



1280x1024 Everything on Low. No AA




I hope this puts the whole Crysis debate to rest.
 
Can you do some benchmarks on CoD4 at the lowest settings, and lots of other games at the lowest detail possible? I've finally met a person with a very similar processor to mine... PCIGamer?
 
Here is my Crysis Benchmarking.

My PC Specs are:

Dimension 2350
Windows XP SP2
P4 2.8 GHz Northwood CPU
1 GB RAM
PNY Nvidia 8400GS 512MB PCI - stock clocks, no overclocking
Onboard Sound
Thermaltake 430 W PSU

Crysis Demo
800x600 - All Low Settings

I am getting 8-20 fps on average. Very similar to the Powercolor video posted above. If I look at the ground I get more than 30 fps. Amazing!

If I switch to 1024x768 I lose 2 fps. If I turn shaders to high I lose 4 fps.

This is incorrect. At 1280x1024 with everything on low I average 5 fps. The game is not playable at this resolution.

800x600 on low, or possibly 1024x768, are the only playable resolutions/modes. And that's on a P4 2.8 GHz CPU!

Some Screenshots:

800x600 Everything on Low. No AA

http://c83.cc/images/72b1bcfqjdsvjm5e70e4_thumb.jpg

1024x768 Everything on Low. No AA

http://c83.cc/images/t8p2aidajyxk7nceltm_thumb.jpg

1280x1024 Everything on Low. No AA

http://c83.cc/images/ktmj5jj8aa7juuc0ybyi_thumb.jpg


I hope this puts the whole Crysis debate to rest.

So you get 8-20 fps on low settings at 800x600 - 1024x768, but at 1280x1024 you get 5 to 10 fps? So if you get 8-20 and 30+ (and below 40 looking at the ground, and putting the settings above low you lose 4-5 fps), then you are getting between 5 and, say, 35 fps total on "those settings" overall, using an 8400GS and an Intel Pentium 4.

Interesting, but something is clearly wrong there. I am going to do the same thing, same settings as you, and post results later. I never played the game at 800x600 before lol, but I will try it. In fact, I hate putting the game on low, so no telling how good the performance will be.
I have no idea why you are getting around 14 fps in those shots using an Intel Pentium 4. Unless something is up with the PNY 8400GS series, it's possible that the 2400HD is more powerful. The Powercolor version has a lower core and memory clock, and lower texture and pixel fillrates, than the VisionTek 2400HD, if you didn't know.

The PNY 8400GS, when I had it: the core was fine, but the memory clock was too low at 333; it would have been better if they upped it to 700-something. I can confirm that the 8400GS is more powerful than the 2400HD, because it plays Crysis better.
But here's another thing: when I had the 8400GS I was using XP, and the OS was too demanding, so game performance was terrible. Now that I am using Win2k it's easy on my computer; in fact the OS is perfect for my 600 MHz P3, and that is one reason why I am able to play Crysis and other games better. So when I do get my BFG 8400GS, game performance in a lot of games should double over the 2400HD.

Anyways, nice benchmarks.
 
Rumor Killer Arrives...

Alright. My benchmarking session of the PCI 8500GT is complete. To sum it up quickly: Tha General's results make no sense, the results are in line with previous findings and other results from the web, and we can bury this discussion.

The results are at my blog.

I'll post a small part of it here:

"Test Setup:
CPU: Intel E5200 2.5Ghz (Overclocked to 3.8Ghz)
Cooling: Thermalright IFX-14
Motherboard: Abit I-N73HD (My P45 MSI board went up in flames - quite literally)
Memory: A-Data Vitesta 2 x 2GB DDR2 800MHz
Hard Drives: 2 x WD1600AAJS RAID0 and 2 x WD5000AAKS RAID0
PSU: HEC Cougar 750W
Case: Nzxt Tempest
GPU: Sparkle 8500GT PCI 256MB DDR2
Drivers: 178.24 (Windows Vista x64)

Before we get to the results I would like to clarify:
First, these results represent the absolute maximum this card can attain with the present drivers without overclocking (anyone, including, but not limited to, Tha General, saying otherwise is flat-out lying). This is because during the entire benchmarking session CPU usage never went over 65% (so the GPU was struggling along here, and the CPU was feeding it all the data it could process) and because I have no other devices active on the PCI bus (and thus there is nothing else competing for the limited bus bandwidth).

Second, these results were achieved in each gaming title with all settings at their absolute lowest, at a resolution of 1024x768. Crysis was run in DX9 mode."
 
Alright, before I continue, let's keep everything cool. This is fun though, benchmarking PCI cards. As a hardcore, loyal PCI gamer, I am having fun.

Alright. My benchmarking session of the PCI 8500GT is complete. To sum it up quickly: Tha General's results make no sense, the results are in line with previous findings and other results from the web, and we can bury this discussion.
The results are at my blog.

If I am reading this correctly, you get 40 fps in Crysis on low? That's pretty good. You get 100 fps in COD4, but you are playing at too low of a resolution, man. I don't understand any of that. Awesome video cards though. I was thinking about switching over to PCI-E and buying the Diamond 1GB 4670 card for my secondary rig, but I am just going to stick with PCI. So I will buy the Albatron 8500GT or 8600GT, or the Sparkle 8500GT, for my secondary rig.

You'll see a 20% increase in fps at most.
I hope so. Because when I had the PNY 8400GS, it didn't perform too well. I think it's because of the drivers I was using, and also because XP was too much for my computer. I will find out in a couple of weeks :)

.............................................................................

Alright, for the person who said it, PCIGamer: you said that 1280x1024 was not playable. But the funny thing is, at 1280x1024 with textures on medium and everything else on low, the game plays smooth lol. It even plays better than at 1024x768, and at 800x600, ugh, it plays just fine but kinda drags a bit; new cards don't work out too well at low resolutions. There is also a discussion over at another forum I used to post on, where people notice that using higher resolutions gives better performance. This trick only works with certain games, and Crysis seems to be one of them.

Anyways, here are some benchmarks and proof that I ran the game at 1280x1024, textures on medium, everything else on low.

Benchmark testing using Catalyst 8.4, VisionTek 2400HD 256MB 64-bit PCI card, no overclocking, stock settings, on Windows 2000 SP4, Intel Pentium III 600 MHz with 800 memory bandwidth, 90-watt PSU, 500 GB external, 250 GB external, and 10 GB internal HD.

1280x1024, textures on medium, everything else on low
http://www.youtube.com/watch?v=ITDT7xvb9_0

As you can see in the video, I was getting around 9-10 fps, and it drops to 3-5 fps while recording.

800x600 on low: 15-32 fps
http://www.filecram.com/files/Crysis8.jpg
http://www.filecram.com/files/Crysis81.jpg
http://www.filecram.com/files/Crysis82.jpg

1280x1024, all settings low, textures on medium



This is when I blew up a car; look at the fps, 6-8. And remember this is at 1280x1024 with textures on medium, everything else on low.

http://c83.cc/images/fbjtxqghscf4ib7pto5q.jpg
http://c83.cc/images/rzw60dbuvl4od4g7vf.jpg
http://c83.cc/images/l8htj6962i1vdwarvnii.jpg
http://c83.cc/images/drxr1jxaozldegwnaqrz.jpg

And there you go! Ever since I switched over to Win2k, gaming has been almost perfect. It's light on the CPU and, unlike XP, doesn't seem to bottleneck gaming anymore; there is slowdown, but still, very smooth performance on Win2k. I never could play Crysis or any of my games on XP like I can on Win2k, because XP was too demanding, but with Win2k it's a whole different story.

Also, I am starting to wonder just how good a Pentium III is, man. I plan to buy a secondary computer very soon, with Vista 32-bit Home Basic, 2-3 GB of DDR2 RAM, and an Intel Pentium Dual-Core at 2.4 GHz in a Gateway. I can't wait to get it and see how good the performance is. I am sticking with Pentiums and Gateway computers. :)
 
Best PCI card for $

I'm continuing my posts regarding my efforts to find the best PCI-interface video card, at specific price levels. After reading dozens of reviews, the pricing history of these cards, and this (and several other) forums, here is my assessment so far:

Radeon: X1300 and X1550: the 1550 is a souped-up version of the 1300. When the 1550 was first released, it was overpriced and thus disparaged. Now, however, that the prices of both cards are roughly the same, the 1550's small improvements favor it.
The HD 2400, though a powerful card, was designed (as is the whole HD line) to serve two purposes: to show high-definition video and to game. These are conflicting purposes to some degree, and the HD line thus falls short on some of the specs that suit the best gaming cards.
Radeon winner: X1550. (I've looked at the HIS, Diamond, and VisionTek versions, and so far the VisionTek comes out ahead. All these cards can be had at online sales, eBay, etc., for about $40.)

Nvidia: The two best cards now available in the US are the 8400 GS and the 8500 GT. Nvidia's GT line is always better than its GS line, so the 8500 is the clear winner. However, the 8500 is $90 (at, e.g., geeks.com), while the 8400 can be had for $40 including shipping.
Nvidia winner: if price is important, the 8400; if maximum performance is, the 8500.

(A note: I understand that some of us have special PCI needs. However, paying $100 for a graphics card based on outdated technology seems a misguided expenditure to me, when for about $200 you could upgrade your mobo/memory/CPU.)

But I'm still wondering about the X1550 vs. the 8400 GS. I've looked at the specs side by side, but frankly I don't know enough about them (vertices, pipelines, shaders, yadda yadda) to be able to judge between the cards based just on specs anyway. I understand that specs alone aren't enough, and that it takes the benchmarking y'all are always doing to know for sure.

However, if anyone can compare these 2 $40 cards and tell me which (the 1550 or the 8400) is clearly better based on specs, please feel free.

Here's a useful review site that rates the 8400 GS (albeit the PCIe version) as best "entry level card", but also see a sidebar at that site for a list of other sites with detailed video card reviews: w[oo]w[oo]w[oo][dot]consumersearch[dot]com/video-cards. (I can't post links, so figure it out.)
 
I'm just wondering when PCI VGAs will disappear completely, so we can finally make a thread about AGP cards and gaming.
 
But, I'm still wondering about the X1550 vs the 8400 GS.
However, if anyone can compare these 2 $40 cards and tell me which (the 1550 or the 8400) is clearly better based on specs, please feel free.

The best PCI cards on the market are the 9500GT/9400GT, but they are not out for sale yet. The Albatron 8600GT and 8500GT are the second best out, but they are not in the United States - well, maybe, but I am still looking into that. The third best PCI card is the Sparkle 8500GT, and the fourth is the 8400GS and the 2400HD line of cards.

You ask which is better between the X1550 and the 8400GS; well, take a look:

http://www.gpureview.com/show_cards.php?card1=576&card2=506

Clearly the 8400GS is more powerful, way more powerful.

I would go with the 8400GS; try it out for yourself and see how it goes.

I'm just wondering when PCI VGAs will disappear completely, so we can finally make a thread about AGP cards and gaming.
Joke? lol. AMD is supposed to be releasing the 3450 with a higher core and memory clock than the PCI-E version; that's all I know of right now. I hope they keep making them for a while. :)
 
Alright, before I continue, let's keep everything cool. This is fun though, benchmarking PCI cards. As a hardcore, loyal PCI gamer, I am having fun.

You might be having fun, but you're not making any sense. I might sound harsh at times, but you are misleading people with either fabricated or misinterpreted results - something I find insulting.

If I am reading this correctly, you get 40 fps in Crysis on low? That's pretty good. You get 100 fps in COD4, but you are playing at too low of a resolution, man. I don't understand any of that. Awesome video cards though. I was thinking about switching over to PCI-E and buying the Diamond 1GB 4670 card for my secondary rig, but I am just going to stick with PCI. So I will buy the Albatron 8500GT or 8600GT, or the Sparkle 8500GT, for my secondary rig.

You definitely don't understand. Each result has three bars: one for the average FPS, one for the maximum FPS, and one for the minimum FPS. The 100 fps in CoD4 is the maximum FPS; that only happens when you stare at a wall. Now look at the average FPS: the average is below 20 fps, and it reaches 20 fps only at all-low settings at 640x480 - which renders the game completely unplayable, unless you like your FPS games to look like ugly PowerPoint slideshows (the graphics are freaking horrible at that point), which you apparently do - nothing wrong with that. Just remember that the universal "playable" FPS is generally considered to be a bare minimum of a 24-30 fps average, and plenty of people consider it to be higher.
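As a toy illustration of how to read those three bars (the 24 fps threshold is just the bare-minimum figure mentioned above; many people set it higher):

[code]
# Toy illustration: the maximum alone says very little; the average
# (and the minimum) decide playability. 24 fps is the bare-minimum
# threshold mentioned in the post.

def playability(min_fps, avg_fps, max_fps, threshold=24.0):
    if avg_fps >= threshold and min_fps >= threshold / 2:
        return "playable (by the bare-minimum standard)"
    if avg_fps >= threshold:
        return "borderline - the dips will be felt"
    return "unplayable - a high peak doesn't help when the average is this low"

# Roughly the CoD4 numbers discussed above: peaks near 100 fps staring
# at a wall, but an average under 20.
print(playability(min_fps=7, avg_fps=18, max_fps=100))
[/code]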

About playing at too low a resolution - what the heck are you talking about? It is unplayable at those resolutions; if I go any higher it'll crawl to a standstill.

If you get a PCI video card for your secondary rig, you are doing the stupidest thing ever. The 8500GT for PCI costs more than an HD4670 for PCI-E, and the latter will net you far, far, far, far better performance.

Did you even read my review? I am using a massively overclocked Wolfdale CPU, and I cannot play anything with that card at anywhere near acceptable framerates, even on the lowest settings!

I hope so. Because when I had the PNY 8400GS, it didn't perform too well. I think it's because of the drivers I was using, and also because XP was too much for my computer. I will find out in a couple of weeks :)

.............................................................................

Alright, for the person who said it, PCIGamer: you said that 1280x1024 was not playable. But the funny thing is, at 1280x1024 with textures on medium and everything else on low, the game plays smooth lol. It even plays better than at 1024x768, and at 800x600, ugh, it plays just fine but kinda drags a bit; new cards don't work out too well at low resolutions. There is also a discussion over at another forum I used to post on, where people notice that using higher resolutions gives better performance. This trick only works with certain games, and Crysis seems to be one of them.

Anyways, here are some benchmarks and proof that I ran the game at 1280x1024, textures on medium, everything else on low.

Benchmark testing using Catalyst 8.4, VisionTek 2400HD 256MB 64-bit PCI card, no overclocking, stock settings, on Windows 2000 SP4, Intel Pentium III 600 MHz with 800 memory bandwidth, 90-watt PSU, 500 GB external, 250 GB external, and 10 GB internal HD.

1280x1024, textures on medium, everything else on low
http://www.youtube.com/watch?v=ITDT7xvb9_0

As you can see in the video, I was getting around 9-10 fps, and it drops to 3-5 fps while recording.

800x600 on low: 15-32 fps
http://www.filecram.com/files/Crysis8.jpg
http://www.filecram.com/files/Crysis81.jpg
http://www.filecram.com/files/Crysis82.jpg

1280x1024, all settings low, textures on medium



This is when I blew up a car; look at the fps, 6-8. And remember this is at 1280x1024 with textures on medium, everything else on low.

http://c83.cc/images/fbjtxqghscf4ib7pto5q.jpg
http://c83.cc/images/rzw60dbuvl4od4g7vf.jpg
http://c83.cc/images/l8htj6962i1vdwarvnii.jpg
http://c83.cc/images/drxr1jxaozldegwnaqrz.jpg

And there you go! Ever since I switched over to Win2k, gaming has been almost perfect. It's light on the CPU and, unlike XP, doesn't seem to bottleneck gaming anymore; there is slowdown, but still, very smooth performance on Win2k. I never could play Crysis or any of my games on XP like I can on Win2k, because XP was too demanding, but with Win2k it's a whole different story.

Also, I am starting to wonder just how good a Pentium III is, man. I plan to buy a secondary computer very soon, with Vista 32-bit Home Basic, 2-3 GB of DDR2 RAM, and an Intel Pentium Dual-Core at 2.4 GHz in a Gateway. I can't wait to get it and see how good the performance is. I am sticking with Pentiums and Gateway computers. :)

The only thing those benchmarks show is that at select moments of gameplay you can get an unplayable FPS, that most of the time you average in the low 10s (if you even get to 10 fps), and that the moment the slightest graphics-intensive scene arrives, you drop below 7 fps.

Face it (yes, yes, the reality):
1) The P3 stinks.
2) Your game performance is a joke (a horrid one, to boot).
3) You never got the much-harped-about 20-28 average fps you claimed - because if you did, your P3 just outperformed a system with a CPU with more than ten times the computing power, more than four times the RAM, and a GPU with twice the memory bandwidth - which, as any sensible person can see, makes absolutely zero sense.

PCI video cards have their place, but in reality they absolutely suck for gaming. Stop telling people they will work miracles on their systems - they won't.
 
COD4 Video on my rig

http://nl.youtube.com/watch?v=-GOMjdh3Hjg

Getting 7-9 fps here. It improves after the firefight to 20-25 fps, which is considered playable. Note: 800x600 resolution, all settings low, 11 kHz sound, and texture settings on 'Auto'.

Compare it to this video, which is a P4 2.6 GHz with a 7300GT PCI-E:

http://nl.youtube.com/watch?v=qlmWqemzuVA
 
I think Direwolf has a few points. Looking forward to buying another PCI card isn't smart, although you say there will be a release of a PCI card with higher clock rates than a very similar PCI-E card. You're talking about 16 lanes on PCI-E compared to the single shared PCI bus. It's like day and night.

Direwolf, you also mention that 24 fps isn't accepted by some. That's definitely true. Sure, 24 fps is great...if you play Minesweeper. Anything lower than 50-60 fps I can notice.

The General, your reasoning is backwards. Maybe if you were playing Quake 1 or Wolfenstein I could see it as acceptable, but you're using the oldest PC hardware technology to bench the newest games. The resolution you chose (I dunno, might as well be 50x45) doesn't matter, because either way it's still unplayable. You do realize there are mobos with onboard video that perform better than the card you're using, some of them cheaper than buying the best PCI card available.

I remember when this thread started, it was about helpful tips on PCI cards. The issue here is that 99% of the people coming to this thread aren't playing Crysis or CoD4. They are very casual gamers, playing The Sims, tycoon games, or what have you.
 
w[oo]w[oo][dot]consumersearch[dot]com/video-cards. (I can't post links, so figure it out.)

Finally looked at this comparison, and the ATI card would come out on top, not the 8400 - in most situations, well, just about all. Since you're using a PCI card in a PCI-E world, the extra memory bandwidth will give you the advantage, as clock speed doesn't mean much when your card is too slow to read textures at a fast rate anyway, purely because of the interface.
 
Sparkle has put their 9400GT on their PCI list:

http://www.sparkle.com.tw/product.asp?id=94

A 1GB 9400GT PCI with a 128-bit bus? WOW, I want one, and I will be sure to buy one when they are released. However, Sparkle needs to fix their images; it's a PCI-E card pictured, from what I can see.

I think Direwolf has a few points.
No, he has a difference of opinion.

Looking forward to buying another PCI card isn't smart, although you say there will be a release of a PCI card with higher clock rates than a very similar PCI-E card.
I see nothing wrong with PCI cards, and I game just fine at pretty good settings without any hassle. So I like buying PCI cards and I plan to stick with them. When PCI cards finally stop being made, I will move on to PCI-E, first card being the Diamond 1GB 4670 :)

Also, AMD is supposed to be releasing a PCI version of the 3450 - they call it the "PCI Solution" - and it's supposed to ship with a higher core and memory clock than the PCI-E version.

On a personal note, I believe they will also make a 4-series card, because a lot of these 4-series cards do not require a power connector, and if you look at them, they are pretty easy to use.

Direwolf, you also mention that 24 fps isn't accepted by some. That's definitely true. Sure, 24 fps is great...if you play Minesweeper. Anything lower than 50-60 fps I can notice.

I get 15 fps in Jericho about 80% of the time. It plays just fine on my end, very smooth and fast, with a little bit of slowdown, but I like it :)

The General, your reasoning is backwards. Maybe if you were playing Quake 1 or Wolfenstein I could see it as acceptable, but you're using the oldest PC hardware technology to bench the newest games. The resolution you chose (I dunno, might as well be 50x45) doesn't matter, because either way it's still unplayable. You do realize there are mobos with onboard video that perform better than the card you're using, some of them cheaper than buying the best PCI card available.

Alright, my reasoning is not backwards. I have a 90-watt PSU, and I play Crysis at 1280x1024 with textures on medium and everything else on low, and still get near 15 fps. Besides Crysis, I can game just fine with games from the past up until late 2008. Just because a computer is old doesn't make it weak.

I have already proven it to you, but Dire thinks I am editing photos, which is pretty much impossible for me to do; let alone that I don't even use crappy Photoshop.

Here are some BioShock photos. Test: 2400HD / actor and texture settings at highest, post-processing high, detail shaders on high, everything else off or low, 7-15 fps. No benchmark numbers.

http://i42.tinypic.com/ercppl.jpg
http://i39.tinypic.com/1256n1c.jpg
http://i44.tinypic.com/zwawsi.jpg
http://i43.tinypic.com/5yyyig.jpg

Tomb Raider, 2x AA, max settings, full-screen DirectX 9 effects, 15-30 fps

http://www.filecram.com/files/TOM2.jpg
http://www.filecram.com/files/TOM1.jpg
http://www.filecram.com/files/TOM.jpg

This was tested with my 6200 card; see the stock core and memory settings and temps in the upper-left corner.

Here are some Infernal and Jericho shots:

http://c83.cc/images/roefjer02pm8agr04wc.jpg
http://c83.cc/images/uuvcykufqm9gi6p0l7v2.jpg
http://c83.cc/images/gwwmnhcojjyeognoawj.jpg
http://c83.cc/images/agqfkglhlts8hf19683w.jpg

People think I am lying, but I am not misleading anyone haha. I can game just fine at decent settings. And when I buy my secondary rig, the Gateway with an Intel Pentium Dual-Core and the Albatron 8500GT or 8600GT, I should game even better.

Anyways, there's no point in continuing to try to prove it to people; hell, I posted a video of myself playing Crysis at 1280x1024 and someone still doesn't believe me lol.

So I am just going to move the subject away from benchmarking and all that. I like PCI cards, and that's that. Cheers!
 
Best PCI

Finally looked at this comparison, and the ATI card would come out on top, not the 8400 - in most situations, well, just about all. Since you're using a PCI card in a PCI-E world, the extra memory bandwidth will give you the advantage, as clock speed doesn't mean much when your card is too slow to read textures at a fast rate anyway, purely because of the interface.

Thanks for the useful response. (Note, please, General: it does me no good to go to GPUreview and look at specs, since I don't know enough about the specs to know which are better or which are important.) It's true that the 8400 has better specs in some areas, but the X1550 also has a 128-bit bus, while the 8400 has a 64-bit one. But I welcome further takes on this comparison.
 
I have learned that a 64-bit bus means nothing; my 6200 can game well. One question though: which brand of X1550 are you talking about?
The only X1550 card that has a 128-bit bus is the Diamond, which I have been trying to find at a cheap price for about 2 years now, but everywhere I look it's over 130 dollars.
 
I could be wrong on this, please confirm.

The Visiontek website is a little vague about this:

visiontek.com/teksupport/gpuspecs/X1550%20final.pdf. I can't tell if it's 128 or 64 from their overhyped language, but GPUreview says that it's 128 bit.

However.....I just checked back in this forum and noted that Teklord said the 128-bit spec given on GPUreview for the X1300 was an error, and that card is only 64-bit. Since the X1550 is just a souped-up X1300, it's possible that there's an error in its specs as well?

That would mean that none of the candidates I listed in my earlier post (X1300, X1550, HD2400, 8400GS) as "best PCI now available" are 128-bit, except for the 8500GT?
 
It was mentioned before in this forum that the VisionTek, the HIS, and the rest of the X1550 cards are 64-bit. The only true 128-bit, and fastest, X1550 is the Diamond version, which is kinda hard to find. If you do find it at online sites, it's way overpriced. I had no luck finding it.

But Pioikit, 64-bit is not bad, believe me. My 6200 card's performance in games is good. Yeah, the extra width might help, but I don't believe by much.

The good thing about the X1550 - and I am buying the HIS one myself as a backup ATI card for this rig - is its features.

It has Shader Model 3.0 and DirectX 9 support, the memory bandwidth is way higher than the 8400GS's (up to 12.8 GB/s), and the core and memory clocks are even higher than the 2400HD's.
So it's a pretty good card. I've heard mixed reviews on the 8400GS: some don't have any trouble with it, and some do.

I had the PNY 8400GS before, but I had to take it back because I have a small tower and the card is too big. As for performance, it plays Crysis only about 4-5% better than the 2400HD.
It worked OK, but I didn't have it long enough, and besides, it seemed like my computer couldn't handle the card, which killed performance a bit.

The X1550 is a great card, and I will buy the HIS version, which seems to be the best version next to the Diamond one. I am still on the hunt to find the Diamond version at a cheaper price though lol.
 