You bring up some good points in your reply. I honestly didn't consider forms of solar generation other than photovoltaics (PV), since my school does a lot of work with NREL. I apologize for the faulty link there; it summarized that it takes about 10 acres to produce a megawatt.
The confusion here, at least to me, is the discrepancy of units between a watt and a watt-hour. A watt is a measure of energy per unit time (one joule per second), so saying that it takes 10 acres to produce a megawatt is saying that those 10 acres deliver one million joules of energy every second. A watt-hour is that rate of delivery sustained over an hour, so converting the 10 acres per MW figure into roughly .001 acres per MWh accounts for the discrepancy in the unit change there.
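To make that unit conversion concrete, here's a quick back-of-envelope sketch (the 10 acres/MW figure is the one from the NREL summary; the ~.001 acres/MWh falls out if you assume the array runs year-round):

```python
# A watt is a rate: 1 joule per second. A watt-hour is an amount of
# energy: that rate sustained for one hour.
ACRES_PER_MW = 10        # rough PV land use, per the NREL summary
HOURS_PER_YEAR = 8760

# 1 MW sustained for one hour delivers 1 MWh = 3.6 billion joules.
joules_per_mwh = 1e6 * 3600
print(joules_per_mwh)            # 3.6e9 J

# If 10 acres sustain 1 MW all year, each MWh generated "uses":
acres_per_mwh = ACRES_PER_MW / HOURS_PER_YEAR
print(round(acres_per_mwh, 4))   # ~0.0011 acres per MWh
```

So the two figures are consistent once you divide by the hours in a year.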
As far as the discrepancy in the number I used for my land-area calculation: I went to Wolfram|Alpha, entered "United States power consumption", and it said "Result = 3.873 trillion kWh/yr". I believe the 442.1 GW at the bottom is just that same figure converted to average power, and, pulling from Wikipedia and the DOE, I believe the 3.873 trillion kWh/yr (or 3.873*10^15 Wh/yr) figure is roughly right.
http://en.wikipedia.org/wiki/Energy_in_the_United_States states 29 PWh (29*10^15 Wh) in 2005 (Wolfram uses 2008). DOE:
http://www.eia.gov/emeu/aer/pdf/pages/sec1_13.pdf
Interestingly enough, Wolfram seems to be a factor of 10 off of the DOE numbers, and Wolfram|Alpha's cited source is just Wolfram|Alpha's own database. I'll let you draw your own conclusions there.
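For what it's worth, the 442.1 GW number does check out as the average-power conversion of 3.873 trillion kWh/yr, which is what my land-area estimate was based on (the 10 acres/MW figure is the same rough one as above, and this ignores capacity factor entirely):

```python
# Sanity check: 3.873 trillion kWh/yr expressed as average power,
# then land area at a rough 10 acres per MW (capacity factor ignored).
annual_kwh = 3.873e12
hours_per_year = 8760

avg_power_gw = annual_kwh / hours_per_year / 1e6   # kW -> GW
print(round(avg_power_gw, 1))                      # ~442.1 GW

acres = avg_power_gw * 1e3 * 10                    # GW -> MW, 10 acres/MW
print(round(acres / 1e6, 2))                       # ~4.42 million acres
```

So Wolfram's 442.1 GW and the 3.873 trillion kWh/yr are the same quantity in different units, whatever one makes of the discrepancy with the DOE table.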
The transmission point you raised is also a good one, but I was just trying to gauge roughly how much land area is needed. The energy density of nuclear is much higher, so siting isn't much of a concern there, but that is true: the idea of putting all the solar fields in the sunniest regions and exporting the energy would lead to greater transmission inefficiencies.
As for the capacity factor, I tracked the chart back to the Department of Energy to figure out what it means. In brief, it's the fraction of the time a technology is actually available to produce power. Both solar technologies, PV and thermal, are comparatively low due to weather, day/night cycles, and the limited ability of the plant to meet demand when required. Technologies like natural-gas combustion turbines are rated around 30% because they are only used at peak times, so operators shut them off when they aren't needed. The report specifies that "intermittent renewables" are not operator-driven, so I think they run whenever they can. Pushing these factors much higher than where they currently are will be exceedingly difficult, given the efficiencies of the technologies and the percentage of sunny weather. Link to the full article is here:
http://www.eia.doe.gov/oiaf/aeo/electricity_generation.html
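Capacity factor matters a lot for the land estimate, since it scales nameplate capacity down to average output. As a hypothetical illustration (the ~0.2 PV capacity factor here is an assumed round number, not from the report):

```python
# Hypothetical: how much PV nameplate capacity would be needed to
# average 442.1 GW, at an assumed 0.2 capacity factor?
target_avg_gw = 442.1
cf_solar = 0.2          # assumed illustrative value, not from the EIA report

nameplate_gw = target_avg_gw / cf_solar
print(round(nameplate_gw, 1))   # ~2210.5 GW of nameplate capacity
```

In other words, a low capacity factor multiplies the required nameplate capacity, and therefore the land area, by roughly 1/CF.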
As far as nuclear worst-case disasters go, the newer-generation reactors are designed to prevent these even better. As I said before, the technology has made leaps and bounds in the past 40 years since most of the current reactors were put in place, making them even safer than before. While there is an inherent risk with the power plants, my personal belief is that the events at Fukushima are probably the worst case possible with current technologies, but that is open to debate. As for the costs of the enrichment process, I'm not entirely sure about that myself, but I believe it would fall under the cost per kWh in the DOE/wiki table I referenced, as would the cost of fuel per kWh.
I've pretty much reached the extent of my knowledge on the subject, but I hope I've helped a little bit. Don't get me wrong: as soon as I can afford to, I'm installing solar for my own use, and I believe anyone who can, should. But until a new infrastructure can be brought about, I believe nuclear is the best option, even if only an intermediary one.