The Ivanpah plant isn't a great example. The tech is already going obsolete, and it's producing far below its theoretical potential - the sunlight falling on that land could in principle yield gigawatts. Fundamentally, that's because solar is an immature technology and nuclear is a mature one. So what would the efficiency and cost be for a mature solar plant? And how do we get to maturity?
If a solar plant could hit 50% of nuclear's energy output on the same land (and even the poor-performing Ivanpah plant suggests that's possible), that raises the question of why we need the cost and complexity of nuclear.
Solar Star (see adventured's comment parallel to yours) may be the current state of the art. It claims 747 MW on 3200 acres. That still doesn't put it in the same league as nuclear.
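A quick sanity check on the quoted figures (747 MW on 3,200 acres, both taken from the comment above, not independently verified) gives Solar Star's nameplate power density:

```python
# Back-of-envelope power density for Solar Star, using the
# figures quoted above: 747 MW nameplate on 3,200 acres.
ACRES_PER_SQ_MILE = 640

capacity_mw = 747
acres = 3200

sq_miles = acres / ACRES_PER_SQ_MILE   # 5.0 square miles
density = capacity_mw / sq_miles       # ~149 MW per square mile
print(f"{sq_miles:.1f} sq mi, {density:.0f} MW/sq mi nameplate")
# → 5.0 sq mi, 149 MW/sq mi nameplate
```

That's nameplate capacity, not average output - with a typical solar capacity factor the delivered power per square mile is several times lower, which is why it doesn't compare to a nuclear site.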
As a follow-up, I did some calculations and figure the US averages about 534 GW of electricity demand - say it peaks at 1 TW (1,000 GW), which seems reasonable. Both solar and wind appear to be capable of about 1 GW per square mile these days, based on existing installations. So 1,000 square miles of wind/solar, plus some battery storage, could cover 100% of our typical electricity needs. This seems pretty feasible to me.
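The arithmetic works out like this (both input figures are taken from the comment above, treated as power, and not independently verified):

```python
# Land needed to cover US peak electricity demand, using the
# figures from the comment above (treated as power, unverified):
# ~1,000 GW peak demand, ~1 GW per square mile of wind/solar.
peak_demand_gw = 1000
gw_per_sq_mile = 1

area_needed = peak_demand_gw / gw_per_sq_mile  # 1,000 square miles

# For scale: the side of a square with that area.
side_miles = area_needed ** 0.5
print(f"{area_needed:.0f} sq mi, roughly a {side_miles:.0f}-mile square")
# → 1000 sq mi, roughly a 32-mile square
```

The answer is sensitive to the per-square-mile figure: if real-world installations deliver well under 1 GW per square mile, the required land scales up proportionally.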
True, but does it have to be? We can throw more land at the problem. I live in the Midwest, where we're starting to see the countryside dotted with windmills. If you can generate a gigawatt per square mile, how many square miles do you need?