Compatibility worries

Well, captain, in regards to your statement on "eye toward expansion": I am assembling it myself using some of the best products I can find that are within my budget. I am also hoping to keep this for a little while longer than anyone could keep a laptop, because I realize that it is impossible to future-proof anything when it comes to technology; but when you're a grad student working at a casino, not making much money, I can only do so much. I am also keeping in mind the idea of expandability, and I do realize that yes, I may need to upgrade this or that here or there. But hopefully I won't have to worry about doing something like that for a few years to come...
Since the 1080p standard hasn't even been fully adopted yet, and is unlikely to change anytime in the near future, your machine stands a chance of a longer service life than if you were building a gaming rig. The whole competitive frenzy thing drives gamers to Newegg in droves (or perhaps "herds").

OK, the point of my whole "intervention" here was a cautionary tale against "overbuilding".

My bottom line position is that one really solid video card would be enough for this type of usage.

Now, video editing, at least the cut-and-slice grunt work, isn't really that graphics-dependent.

The CPU is solely responsible for the rendering of the footage. Whatever involvement the VGA has is entirely transient. Once the project is ready to be "printed", so to speak, you could take the video card out of the computer, throw it and the monitor away, and just hit "Enter".
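To put a rough number on that claim, here's a little sketch (my own, purely illustrative) that samples CPU load while a software-only x264 encode runs. It assumes ffmpeg is on the PATH, the Python psutil package is installed, and "input.mp4" is just a placeholder file name; no GPU acceleration is requested, so the work lands squarely on the CPU.

```python
# Rough sketch: watch CPU utilization during a software-only (CPU) transcode.
# Assumptions: ffmpeg is on the PATH, psutil is installed, input.mp4 exists.
import subprocess
import psutil

# Start a pure-software x264 encode; no GPU acceleration is requested,
# so all of the rendering work is done by the CPU.
encode = subprocess.Popen(
    ["ffmpeg", "-y", "-i", "input.mp4", "-c:v", "libx264",
     "-preset", "slow", "output.mp4"],
    stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL,
)

# Sample overall CPU utilization once a second until the encode finishes.
while encode.poll() is None:
    print(f"CPU utilization: {psutil.cpu_percent(interval=1.0):5.1f}%")

print("Encode finished with return code", encode.returncode)
```

On a machine like the one being discussed, you'd expect those samples to sit near 100% while the video card contributes nothing to the final render.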

Now, this discussion took a bit of a strange turn when the term "olden days" came into play. In the "olden days" a 6200GS was considered "a mighty fine, rippin' fast video card" that would charge into action, bringing its enormous 64MB of RAM to bear on the graphics task at hand. And it's a laughable flea-market part today. The 9500GT that everybody insults at every available opportunity will outperform cards of that era by a factor of ten.

So, when so many "helpful people" pile on and start imposing their own ultimate gaming box hardware fantasies on a machine that really needs to be only as competent as a 100 dollar Blu-ray drive, at least with respect to video performance, I just say, "shouldn't we think about this first?" Buy one of the planned cards now, the other one later, and only if you need it. That's my story, and I'm stickin' to it.

Anyway, here are a couple of "real workstation" VGA cards to ponder: http://www.newegg.com/Product/Produ...&N=100008333 600044687&IsNodeId=1&name=matrox
 
When you talk about CAD (or similar) software, workstation graphics are more precise and accurate, because they are optimized for vertices and vectors instead of textures; hence they can be considerably faster than their desktop cousins (forget about onboard stuff for this comparison).

I keep reading that the workstation GPU drivers make all the difference, especially the nVidia Pro drivers. So the question is: when should one actually consider getting a workstation GPU, given their huge price, if ordinary gaming GPUs can do the same work?
 
You're comparing apples with oranges. Workstation cards "work" differently than desktop graphics. They are geared towards multi-monitor setups, large vRAM, and much more stable drivers that are geared towards rendering (3D apps in particular). Workstation cards are also generally more expensive because they come with 24/7 phone/tech support during their warranty period, typically three years.
 
If you are going to recommend workstation cards, then Matrox isn't the one offering the best price/performance/feature ratio. I think other alternatives, e.g. nVidia or ATI, can be of more interest.

So here are the options for ATI and nVidia. In particular, if you are willing to spend anything over the $400 mark, the ATI FireGL V8650 is a far superior performer to the Matrox; at the lower end of this spectrum, an nVidia Quadro FX looks like a reasonable option (unless you want to attach three or more monitors).
 
If you are going to recommend workstation cards, then Matrox isn't the one offering the best price/performance/feature ratio. I think other alternatives, e.g. nVidia or ATI, can be of more interest.
It's hard to imagine that my humor is subtle enough to convince someone that I was actually recommending one of those Matrox offerings, but in hindsight, and given the wording, I suppose you could think so.

OK, this was a practical joke: http://www.newegg.com/Product/Product.aspx?Item=N82E16814106024 It has 32MB of RAM, is connected via PCI-E x1, has a dual DVI output that requires an external splitter, and costs $164.00.

Basically, it's from the "olden days", nothing more than a relic Newegg is stuck with.
 
@captaincranky forgive me for being naive. I am pretty sure you explained it well enough for everyone to understand, but I am having a hard time understanding. Are you saying a quad-core CPU isn't necessary when building a gaming rig because the software/technology/drivers aren't available to take full advantage of the multi-core hardware, or that it's just simply useless? Also, you said something along the lines of "If you have a powerful graphics card then it can't be bottlenecked by a slower CPU"! As much as I hear the term bottleneck I am beginning to hate it, but its presence seems to be undeniable!

Don't laugh, but I don't have the cash to put together the system I want all at one time, and when I can afford to buy a new part, the previous parts I bought seem to be outdated! So with that being said, I am running the N680i SLI (A1) motherboard and currently one GTX 280SC with 2GB of Corsair DDR2 800MHz memory, and I have the 3.6GHz P4/HT Prescott 560J running everything (this CPU has come out of my other three systems since 2004, it still serves well, and it's all I have until I buy a new one!). Now the system can run any game at 1600x1200 with everything set to high, even anti-aliasing enabled at the highest setting!

The issue I see is that no matter what I change my resolution to, I always get the same FPS (18 min to 45 max), even running at 640x480 in-game and also in benchmarks. Everyone I come across says it's because my CPU is bottlenecking the system! I was running a 9800GTX+ before I got the current GTX 280SC, and the FPS seem to be exactly the same with the GTX 280SC as I had with the 9800GTX+! So is that a bottlenecking issue, and would upgrading to a quad-core QX6850 improve my performance better vs. an E8600? I bought the motherboard because I wanted something affordable that I could keep for a while, so SLI and CPU upgradeability won, hands down! Please tell me if what I'm explaining is what you were referring to. Some of my roommates disagree with you and some agree! I just would like to know if what I'm explaining is what you are talking about.
 
I myself had a P4 Prescott clocked at 3GHz, and it severely bottlenecked my XFX 9800GT in Crysis.
 
@captaincranky forgive me for being naive. I am pretty sure you explained it well enough for everyone to understand, but I am having a hard time understanding. Are you saying a quad-core CPU isn't necessary when building a gaming rig because the software/technology/drivers aren't available to take full advantage of the multi-core hardware, or that it's just simply useless?

Yes, if I'm not mistaken, what he is saying is exactly true: a lot of programs and applications, as well as games, can't fully use all the technology that's offered with a quad core. The hardware is coming out before the software that can fully utilize the quad-core setup. Or so I hear. Now, that's no reason not to get something of such high quality, because in due time applications and programs will come out that can take advantage of what is being offered. But as of right now, not many programs or games can fully utilize quad-core technology. The reason that I am getting it is because, in time, what comes out will only use quad cores, and I just want to be prepared; but I know full well that a lot of the time it will basically just be producing heat in my room. But once Vegas Pro comes out with software where I can properly use all four cores, I will just be one step ahead of the game.
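For what it's worth, here's a toy sketch of what "properly using all four cores" actually asks of the software: the program itself has to split its work across worker processes, otherwise the extra cores just sit there. Nothing here is from Vegas or any real editor; the chunk count and the fake render_chunk function are invented purely for illustration.

```python
# Toy illustration: a CPU-bound job only uses extra cores if the software
# explicitly splits the work. "render_chunk" is a stand-in for real
# per-chunk rendering work, not anything from an actual editing package.
import time
from multiprocessing import Pool, cpu_count

def render_chunk(chunk_id: int) -> int:
    """Burn some CPU to stand in for rendering one chunk of a timeline."""
    total = 0
    for i in range(2_000_000):
        total += (i * chunk_id) % 7
    return total

if __name__ == "__main__":
    chunks = list(range(1, 17))  # pretend the timeline is split into 16 chunks

    start = time.time()
    for c in chunks:             # single-core path: one chunk after another
        render_chunk(c)
    serial = time.time() - start

    start = time.time()
    with Pool(processes=cpu_count()) as pool:   # multi-core path
        pool.map(render_chunk, chunks)
    parallel = time.time() - start

    print(f"serial: {serial:.2f}s  parallel ({cpu_count()} cores): {parallel:.2f}s")
```

If the application only ever runs the first loop, a quad core really is just producing heat in your room.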
 
Also, you said something along the lines of "If you have a powerful graphics card then it can't be bottlenecked by a slower CPU"! As much as I hear the term bottleneck I am beginning to hate it, but its presence seems to be undeniable! ... The issue I see is that no matter what I change my resolution to, I always get the same FPS (18 min to 45 max), even running at 640x480 in-game and also in benchmarks. Everyone I come across says it's because my CPU is bottlenecking the system! I was running a 9800GTX+ before I got the current GTX 280SC, and the FPS seem to be exactly the same with the GTX 280SC as I had with the 9800GTX+! So is that a bottlenecking issue, and would upgrading to a quad-core QX6850 improve my performance better vs. an E8600?
The term "bottleneck" when used in a colloquial, abstract, or slang manner, directly refers to a restriction of flow. "Because of the overturned truck, traffic was "bottlenecked" down to one lane". A CPU >>generates the flow<< it can't "bottleneck" itself, it can only create insufficient flow! Now contemporary "Geek Speak" calls everything that >>lessens flow <<a "bottleneck". I disagree with many applications of the term. In your example, I would say that your CPU is inhibiting the game, creating insufficient data, not up to the task, something on that order, but not really bottlenecking it.

Now, if your CPU were capable of generating 200 FPS, and your video card was only capable of offloading 50 FPS, then that would be the true or traditional application of the term. Yep, that baby would be "bottlenecked"...!
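If it helps, that traditional sense of the word boils down to one line of arithmetic: the pipeline runs at the rate of its slowest stage. The 200 and 50 FPS figures below are just the hypothetical numbers from the paragraph above, not measurements.

```python
# Traditional "bottleneck": the pipeline runs at the rate of its slowest stage.
# The 200 and 50 FPS figures are the hypothetical from the post above.
def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Frames per second actually displayed, capped by the slower stage."""
    return min(cpu_fps, gpu_fps)

print(effective_fps(cpu_fps=200, gpu_fps=50))   # 50 -> the GPU is the bottleneck
print(effective_fps(cpu_fps=40, gpu_fps=200))   # 40 -> the CPU just can't feed the GPU
```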

As to the quad vs. dual core issue, more games seem able to utilize multiple cores now than just a couple of years ago. The conventional wisdom up until now was that the higher clock speeds the dual cores (notably the Intel E8XXX Wolfdales) were capable of were of more benefit to gaming than extra cores. Some of today's quads are approaching the clock speeds of the duals, and offer many benefits in other areas of computing as well (CAD, Photoshop, movie editing, and transcoding). It's probably time to consider four cores the optimum.
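That clock-speed-versus-extra-cores trade-off is basically Amdahl's law in action. Here's a quick back-of-the-envelope sketch; the 60% parallel fraction is an assumption picked purely for illustration, not a measurement of any particular game.

```python
# Amdahl's law: speedup from N cores when only a fraction p of the per-frame
# work can run in parallel. The p = 0.6 figure is purely illustrative.
def amdahl_speedup(p: float, cores: int) -> float:
    return 1.0 / ((1.0 - p) + p / cores)

p = 0.6  # assumed parallel fraction of a game's per-frame work
for cores in (1, 2, 4, 6):
    print(f"{cores} core(s): {amdahl_speedup(p, cores):.2f}x speedup")

# With only 60% of the work parallel, 4 cores give roughly 1.8x, not 4x,
# which is why raw clock speed mattered so much for older game engines.
```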

I'm not a gamer, but those at the forum who are, at least the more experienced (read: "more affluent") among them, have opted for quads, if not the new hexacore offerings.

On the other hand, one of our members, not two years ago, opted out of a quad and went with a high-end Intel dual core for gaming.

With all that being said, today even the lowest-end Intel Core i3-530 will take the measure of most of the E8XXX series, with hyperthreading to boot. (Am I still allowed to use the slang phrase "to boot" without referring to starting up a computer? Gosh, I hope so.)

With all of that being said, why not check out this CPU report at The Tech Report: http://techreport.com/articles.x/18448 With enough Red Bull, this should keep you and your roomies up to the very wee hours of the next day, arguing about whether "if a CPU burns up in a forest, and nobody's there to unplug it, can you see or hear the forest fire?" (Geek existentialism.)
 
I myself had a P4 Prescott clocked at 3GHz, and it severely bottlenecked my XFX 9800GT in Crysis.
No, it didn't. It just wouldn't generate sufficient information to utilize the full capacity of the video card. A "bottleneck" would be something more on the order of eating five pounds of cheese, slamming down a half dozen Percocets, then trying to take a s***...! :haha: :wave:
 
@UM
What the cap'n is saying is this (I believe)...

[Attached image: Crysis_thread_usage.jpg]


From iXBT's excellent seven-part series regarding CPU architecture and the utilisation (or not) of multithreaded/multicore applications.
In the above instance, CPU core speed is more relevant than parallel computation (multithreading), since the game is not coded or optimised to allow the workload to be executed simultaneously. Where a game allows for parallel execution, a single-core CPU, or indeed a dual-core, will "work slower" than the graphics card's ability to render textures, etc.

EDIT: Sorry for jumping on your thread, cc. You posted while I was uploading the jpg; didn't realise you had the job in hand... so to speak.
 
DeeBeeZee, there's no need to apologize, this is far from my thread. Although, I daresay I think I had one "shining moment" in post #35....:rolleyes:
 
@UM

In the above instance, CPU core speed is more relevant than parallel computation (multithreading), since the game is not coded or optimised to allow the workload to be executed simultaneously. Where a game allows for parallel execution, a single-core CPU, or indeed a dual-core, will "work slower" than the graphics card's ability to render textures, etc.
That's an interesting point, and it should provide a lively debate: can software "bottleneck" a CPU? I'm tired of typing ATM, so I'm gonna stand pat with, "meh, DILLIGAF?"
 
Poorly coded or "unoptimised" games often "bottleneck" hardware. RTS games and flight sims (amongst others) are quite often "CPU bound", not by core speed or thread count but by one thread that is tasked with either a large workload or poorly coded routines (for example, AI, environment rendering, or physics computation). Regardless of the ballyhoo regarding PhysX, all physics engines, whether GPU- or CPU-based, require CPU input.
There's also the case where extra threads offer better graphical quality but don't increase the metrics that are usually associated with benchmarks, namely the almighty f.p.s.

[Attached image: Core_speed_count.jpg]
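To put a number on that, a frame can't finish any sooner than its heaviest single thread, no matter how many cores sit idle. Here's a tiny sketch; all of the millisecond figures are invented purely for illustration.

```python
# If per-frame work is split into threads that run in parallel, the frame
# time is set by the heaviest single thread; spare cores don't shorten it.
# All of the millisecond figures here are made up for illustration.
frame_threads_ms = {
    "render submission": 6.0,
    "AI":               22.0,   # one overloaded or poorly coded thread
    "physics":           9.0,
    "audio":             2.0,
}

# Assumes there are enough cores to run every thread at the same time.
frame_time_ms = max(frame_threads_ms.values())
slowest = max(frame_threads_ms, key=frame_threads_ms.get)
print(f"frame time: {frame_time_ms:.1f} ms "
      f"(~{1000.0 / frame_time_ms:.0f} FPS), limited by {slowest}")
```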
 
I took it as constipated... oh well, tomato, tomahto.
They are correct, however; most people have 'bottleneck' 180 degrees from what it is.

Poorly coded or "unoptimised" games often "bottleneck" hardware. RTS games and flight sims (amongst others) are quite often "CPU bound", not by core speed or thread count but by one thread that is tasked with either a large workload or poorly coded routines (for example, AI, environment rendering, or physics computation). Regardless of the ballyhoo regarding PhysX, all physics engines, whether GPU- or CPU-based, require CPU input.
There's also the case where extra threads offer better graphical quality but don't increase the metrics that are usually associated with benchmarks, namely the almighty f.p.s.

Anyway, Chef/Cap, to put this in context, using Metro 2033 as an example of a game that brings most VGAs to their knees: is the game "unoptimized", or is it creating a 'bottleneck' from the number of textures/triangles and/or physics? Or, for that matter, both?
 
The graphical features are the real killer: tessellation and depth of field impact performance severely. Running the game on AMD hardware is going to carry a further penalty, since the game is optimised for GPU PhysX, which I think is offloaded (poorly... x87 compile?) to the CPU with AMD cards.
I have the game installed on my secondary rig (HD 5850s) at the moment, and I wouldn't say the performance/IQ is anything to write home about. My primary rig (which is on semi-permanent loan to my nephew) also has the game loaded, and it runs like melted butter using the GTX 280s (with higher IQ)... but then, the game was optimised to use a dedicated thread for PhysX, so maybe with AMD GPUs, two (or more) threads are seconded to do the physics work.
I think the game could have been coded better, although that is coloured by the comments of others; that's guesswork, as I would need to see another game using the 4A engine to judge.
 
There are a bunch of different interpretations. I would offer this: software that only utilizes one thread can't "bottleneck" a CPU. The CPU outputs what it can, generating the throughput it is capable of. That's not a bottleneck; that's a part in need of an upgrade. However, perhaps "bottleneck" is more appropriate in a software environment that has multithreading that the processor can't deal with. I simply don't equate a part incapable of sufficient throughput as a bottleneck, only a part that limits the output of the previous link in the chain. For example, my 50 FPS VGA installed after a 200 FPS CPU analogy. Perhaps my definition is too narrow, perhaps I like to argue, perhaps so does everybody else.

With that said, "bandwidth" is also a term slightly misapplied in certain areas. Bandwidth is only appropriate if information is being processed in parallel; in some cases, "band speed" might be a better term. "And back to you in the studio Red...."

"Bandwidth" is a term that was coined in conjunction with TV/radio transmission/ receiving technology. In the case of AM transmission, Bandwidth would refer to the Peak to peak differential of signal modulation, whereas, with FM transmission, "bandwidth" refers to the frequency differential of the signal, and it's relationship to interference with adjacent channels. Both statements could be true here also, if the IP were processing information from many clients simultaneously, in parallel, then bandwidth would apply, if information is only being serially dispersed the "stream speed" would be more appropriate.
 
"And back to you in the studio Red...."

okay then...Thanks Cap....hot enough out there for ya?
Well, I get exactly where you are coming from. This is why you are getting the feedback, I think, because this is what is largely being offered as the definition of "bottleneck": that it is simply the slowest and most incapable component in the system.
I simply don't equate a part incapable of sufficient throughput as a bottleneck, only a part that limits the output of the previous link in the chain.

http://www.youtube.com/watch?v=x6hkGiYW21s

However, if you read reviews/benchmarks at Tom's, for example, the 'bottleneck' is the component that cannot be, or is not being, used to its full potential, and it's being caused by the program that is being run.

I am with you on this. To me this is a 'weak link', not a bottleneck. However, I have to stop and count fingers to make sure which I am talking about at times, because they are different.
 
Here's what I do to stop counting my fingers...

I am with you on this. To me this is a 'weak link', not a bottleneck. However, I have to stop and count fingers to make sure which I am talking about at times, because they are different.
I feel you with that. I put my free hand in my pocket while I'm online, although just not at Techspot. In a serendipitous turn of events, I'm left handed, but I still mouse with my right. :rolleyes:
 
I feel you with that. I put my free hand in my pocket while I'm online, although just not at Techspot. In a serendipitous turn of events, I'm left handed, but I still mouse with my right. :rolleyes:


....and I knew you would take a shot at that when I typed it :p:haha:


I put my free hand in my pocket while I'm online


...does that have anything to do with your new monitor?
 
I'm going into "Disc Management" now. I'm going to try and rename the "G:/" drive, the "OY:/" drive.

...does that have anything to do with your new monitor?
Well no, not exactly. That is on a machine not connected to the internet. When I use that, I give it my undivided attention, which means, "both hands"!
 
I'm going into "Disc Management" now. I'm going to try and rename the "G:/" drive, the "OY:/" drive.


Well no, not exactly. That is on a machine not connected to the internet. When I use that, I give it my undivided attention, which means, "both hands"!


Of course, what a fool I've been!
I just moved; maybe you can see the smoke signals from my house now. :rolleyes:
 
Of course, what a fool I've been!
I just moved; maybe you can see the smoke signals from my house now. :rolleyes:
There's an outside..? Wow, what a fool I've been...!

I'm actually afraid to go out and look, the natives have been really, really restless today.

"Not so pale puss have'm many guns"
 