Compatibility worries

Hello, I am currently looking into building my own computer for HD video editing. I am looking at putting in two VisionTek 4850 graphics cards, which are CrossFireX ready. Is this compatible with an EVGA X58 SLI LE Intel motherboard? If not, any suggestions on a mobo would be swell; I would like to keep the graphics cards I have selected, though.
 
Thanks crunchie, that's what I had thought, but I wasn't positive so I wanted to be sure. If anyone has any suggestions on what mobo I should opt for, within a reasonable price range, I'm all ears.
 
I will be running an i7 930 Bloomfield 2.8 GHz quad-core processor. I would like to stay under 200 dollars, if that's even possible. It also needs to be CrossFireX ready, which I'm sure you already know. I have found a mobo, an EVGA X58 FTW3 132-GT-E768-KR LGA 1366 SATA 6Gb/s USB 3.0 ATX Intel motherboard; unfortunately it's a little more than I was hoping to spend. Unless I can find another one which packs similar qualities at a lower price, I will just have to wait a little while longer until I am able to attain this. (I apologize for not being able to post links to the mobos and GPUs I've selected.)
 
That is the problem with going Intel at this time: $$$$'s. Unless you are going to actually use everything that the i7 offers, the six-core AMD systems are really good value.
 
I'm pretty sour toward AMD Phenoms because that's what powers my laptop, and I'm not happy with it one bit. I suppose it gets the job done, and laptop versus desktop is comparing apples and cashews, but that one bad experience turned me away indefinitely.
 
Just out of curiosity, what particular bad experience have you had with AMD Phenoms? It might serve as an eye-opener for us newbies who want to build their own PCs for video editing use.
 
Having just glanced at this thread, it seems that some people are under the mistaken assumption that the EVGA boards (the LE in particular) don't support CrossfireX.
They do, but since EVGA is an Nvidia partner you won't find the Crossfire compatibility publicised, for obvious reasons.

As a point of interest...ALL X58 motherboards are Crossfire capable, while some budget X58 boards are marketed as non-SLI only because the manufacturer has saved a few dollars by not purchasing an SLI licence for a particular model.
Non-SLI boards are:
MSI X58 Platinum (there is a separate X58 Platinum SLI model)
Asus P6T SE
ECS X58B-A2
Foxconn Flaming Blade GTI
Gigabyte EX58-UD3R (requires a BIOS update to enable SLI if the board is an early release model)
 
An AMD Turion Dual Core @ 2.0 GHz is in my laptop; yes, I realize it's older and it is a laptop, but I just don't like it. It doesn't have the speed that I need to do even the simplest of tasks, e.g., copying files, opening and running different programs, and when it comes to iTunes, it fails miserably. Yes, I do realize that the new Phenom cores are leaps and bounds ahead of what I have in my Compaq (yes, it could be part of the problem); it just doesn't cut it for me when I compare it to even an Intel that was released around the same time... And dividebyzero, glad to hear! You just saved me about 100 dollars.
 
I think THIS would be an awesome purchase at a little over $200. However, there are plenty of boards that support CF under $200 as well. Take a look at newegg.com.
 
I don't know if this is a question or a reply, but why would you actually need all that GPU potency, when video >>EDITING & RENDERING<< are done with the CPU? After that, you can play a high-def movie with current integrated graphics.

Adobe likes to run its mouth that it's the only software house that can program to current standards, and that your graphics drivers have to be up to date, but that just seems like an excuse to me for all the crashes and bugs their software seems to exhibit.
 
captaincranky said:
I don't know if this is a question or a reply, but why would you actually need all that GPU potency, when video >>EDITING & RENDERING<< are done with the CPU?

Hi Cap. I wasn't very aware of that. Are 3D modelling, AutoCAD designing, etc. also more dependent on the CPU than the GPU? If so, why does AutoCAD recommend a workstation-class GPU?
 
I'm thinking that the GPU only needs to be capable of the latest T&L plus shader models (OpenGL) in order to properly display your work.

If you think about it, a video card doesn't possess the ability to save to the HDD, so why would we think that anything other than the CPU is doing the editing and rendering? When you come down to it, a 9500GT is tentatively a workstation-quality VGA, at least if you're not a game designer.

Part of this comes down to the badly used/overused/misused term, "bottlenecking."

A very powerful video card can't be "bottlenecked" by a slow CPU; the CPU simply can't deliver all the information that the VGA could potentially handle.

If the CPU delivers 200 FPS and the VGA can only deal with 60 FPS, then that's a correct application of the term "bottlenecking".

Hi-def video is 60 FPS @ 1920 X 1080, so you need a good video card, but not the very best, to reproduce it. It makes sense that if the video card will run Blu-ray well, that should be all that's needed. Now, if you need to pump this to multiple monitors, then I suppose the requirements would go up.
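
A minimal sketch of that logic, using the purely illustrative 200/60 FPS numbers from above (not measurements): the pipeline runs at the speed of its slowest stage, and the slower stage is the one that earns the "bottleneck" label.

```python
# Hypothetical rates from the example above, for illustration only.
cpu_fps = 200  # frames per second the CPU can prepare
gpu_fps = 60   # frames per second the video card can render

# The pipeline runs no faster than its slowest stage.
effective_fps = min(cpu_fps, gpu_fps)
bottleneck = "VGA" if gpu_fps < cpu_fps else "CPU"

print(f"Effective rate: {effective_fps} FPS, limited by the {bottleneck}")
# -> Effective rate: 60 FPS, limited by the VGA
```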
 
3D modelling applications can take advantage of available graphics hardware, which would considerably improve performance. I think I have noted on these forums somewhere that, in the olden days, a topographic geological survey drawing in AutoCAD would take much longer to open on integrated graphics, but even back then discrete graphics cards outperformed it by reasonable margins.
 
3D modelling applications can take advantage of available graphics hardware, which would considerably improve performance. I think I have noted on these forums somewhere that, in the olden days, a topographic geological survey drawing in AutoCAD would take much longer to open on integrated graphics, but even back then discrete graphics cards outperformed it by reasonable margins.
With that being said, the CPU is still responsible for "doing the math", as it were, in the creation of the drawing itself. And what is a movie, if not a lifelike drawing?

I do think it would be prudent to consider the actual hardware capability of the "olden days" before you compare it to the hardware capability of today. What you'll find is specs that are laughable: 800 MHz Pentium 3 CPUs, 32 MB of VRAM, and so on. Yet they made movies (SD) with it. In fact, I d***ked around with Photoshop 5.5 in college. The discrete video cards were so slow that half the time they didn't redraw the screen after an edit.

Now, the only reason I bothered to enter this thread was to suggest that a massive outlay of cash on CrossFire (or SLI) video cards isn't necessary to play Blu-ray, which is what a movie editor has to do. Hey, buy one good card first, and an SLI-capable board, and get the second one if you need it.

You don't need 300 FPS to play COD4 either, yet people publish "wowee, gotta have one" specs of machines that will do it. Maybe it's to generate traffic on their site, and to humor us "enthusiasts".
 
I only concentrated on the bit related to CAD, where the bottleneck was the onboard graphics solution, which would lag behind the CPU; and yes, I agree lots of math is done on the CPU for the time being.

I never had much interest in Photoshop, so I am unaware of its hardware acceleration characteristics. I think for video editing even a low-end ATI or nVidia card would suffice.
 
Well, with the higher-quality video cards, eventually I will most likely be upgrading to two video cards; I was under the impression they were a necessity regardless. Maybe not two, but then having dual monitors would require two graphics cards. I suppose I am just going off the system requirements that were listed for the PVR I am going to get. I didn't just touch the bare minimum; in some areas I plan on it being much better than what is called for, which is another reason I am leaning toward two graphics cards. I would never have thought that I would be able to not only play HD video but also edit it with integrated graphics. That is what my sister's laptop has, and it can't even play a standard-def video on iTunes without lagging out.
 
You can build your rig with an eye toward expansion, or spend all your money now at one time; either way, I wish you well. On the other hand, I don't give an aeronautical intercourse into a rotating pastry how you spend your money.

What I enjoy here most at TS is the BS.

As soon as anybody starts posting about "video editing", the first thing everybody does is suggest a quad-core CPU, "because video editing programs use all four cores". That having been settled at the outset, I guess it's necessary to "discuss" the need for video processing power sufficient to light up the screens in Times Square.

And James, you would do well not to use an aging laptop as a reference point to overthink your graphics needs.

Even the desktop version of Intel's GMA 4500 will pump out hi-def recorded TV without a glitch.

I love math, at least simple math, that is. Here goes. At 2560 X 1600 (30" monitor) that's 4,096,000 pixels. At 1920 X 1080 that's 2,073,600 pixels, or roughly half. Now, the reason that some well-funded enthusiasts are installing extreme high-end SLI systems is to play games at the higher resolution.

Running dual monitors on Intel i3-530 integrated graphics, it is suggested that both screens be set to 1366 X something or other (720p) to avoid damage.

So, if you subtract the lower number from the higher, you'll figger out the answer is somewhere in the middle with respect to how much you need to spend on VGA to get your project off the ground.
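
To make that simple math concrete, here's a quick sketch using the two resolutions above (2560 X 1600 being the usual 30" panel of the day; treat the figures as illustrative):

```python
# Pixel counts for the two resolutions discussed above.
res_30in = (2560, 1600)   # typical 30" monitor
res_1080p = (1920, 1080)  # hi-def video

pixels_30in = res_30in[0] * res_30in[1]     # 4,096,000
pixels_1080p = res_1080p[0] * res_1080p[1]  # 2,073,600

print(f'30" monitor: {pixels_30in:,} pixels')
print(f'1080p:       {pixels_1080p:,} pixels')
print(f'Ratio: {pixels_30in / pixels_1080p:.2f}x')  # ~1.98, i.e. roughly double
```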
 
I only concentrated on the bit related to CAD, where the bottleneck was the onboard graphics solution, which would lag behind the CPU; and yes, I agree lots of math is done on the CPU for the time being.
Here's that bottleneck term I abhor so much. OK, since the monitor's refresh rate is 60 Hz, with current graphics cards pumping out 200+ FPS, isn't that the real "bottleneck"? And don't even get me started on "persistence of vision", or the fact that feature movies run @ 24 FPS, which is sufficient to eliminate "flicker". So, at the end of the day, our brain is the real bottleneck.
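
Same min-of-the-stages logic as the earlier sketch, extended down the display chain with the illustrative numbers above (a 200+ FPS card feeding a 60 Hz panel):

```python
# Frames (or refreshes) per second at each stage of the chain.
stages = {
    "video card output": 200,  # a current card pumping out 200+ FPS
    "monitor refresh": 60,     # a typical 60 Hz panel
}
# (Feature films get by at 24 FPS, well under either limit.)

# What actually reaches your eyes is capped by the slowest stage.
limit = min(stages, key=stages.get)
print(f"Real bottleneck: {limit}, at {stages[limit]} FPS")
# -> Real bottleneck: monitor refresh, at 60 FPS
```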

I never had much interest in Photoshop, so I am unaware of its hardware acceleration characteristics. I think for video editing even a low-end ATI or nVidia card would suffice.
That's what I'm sayin': half now, half later, and only if you need it.
 
Just to confirm, you have seen Keanu Reeves and Colin Farrell "acting"?
Indeed I have! All they need do to further "hone" their craft is "hook up" with Tara Reid (on camera), and they'll find their comfort zone, so to speak. But can you imagine the nightmare in post, trying to put Tara's girls back up where they belong and get rid of the implant scars?

Aftertaste, er, I mean afterthought: if you received an "Oscar" for a porn performance, would that then actually be called a "Johnson"?
 
Well, captain, in regards to your statement on an "eye toward expansion": I am assembling it myself using some of the best products I can find that are within my budget. I am also hoping to keep this a little while longer than anyone could keep a laptop, because I realize that it is impossible to future-proof anything when it comes to technology; but when you're a grad student working at a casino, not making much money, I can only do so much. I am also keeping in mind the idea of expandability; I do realize that, yes, I may need to upgrade this or that here or there. But hopefully I won't have to worry about doing something like that for a few years to come...
 
captaincranky said:
Here's that bottleneck term I abhor so much. OK, since the monitor's refresh rate is 60 Hz, with current graphics cards pumping out 200+ FPS, isn't that the real "bottleneck"? And don't even get me started on "persistence of vision", or the fact that feature movies run @ 24 FPS, which is sufficient to eliminate "flicker". So, at the end of the day, our brain is the real bottleneck.

When you talk about CAD (or similar) software, workstation graphics are more precise and accurate, because they are optimized for vertices and vectors instead of textures; hence they can be considerably faster than their consumer desktop cousins (forget about onboard stuff for this comparison).

However, when it comes to consumer discrete graphics, which are optimized for rendering textures or playing HD videos, your comment makes much more sense.

Edit:
Just for reference, the difference between discrete and workstation graphics is clearly evident.
 