You're working on a killer new app. Or a small niche website. Or really any kind of human-facing software. If you're like most developers, your primary focus is on functionality: before anything else, it has to work. How can anybody disagree with that? If it doesn't work, then what's the point? Functionality is indeed paramount to success (even for ridiculous outliers like Color), but I would argue that it is equally important to develop a great user experience (UX) before releasing the first public version. "Release early, release often" is a great model in many circumstances, but it should never serve as an excuse for poor quality - the "I'll just fix it in the next point release" mentality is very dangerous when overapplied.
Let's step back a bit. Why do I think that a great UX is just as important as functionality? The single-word answer is perception. I'm working under the assumption that a major goal of whatever it is you're making is that your users actually like it. Unfortunately, making users like your software is not a simple, clear-cut task. A great user experience is required - yes, required - for general likability.

Take, for example, Windows Vista. For the most part, it has been disliked. The reasons for this are numerous, but functionality is not among them, as far as I know. I remember Vista being somewhat slow, severely lacking in third-party drivers, and extremely annoying due to the introduction of User Account Control (UAC). Why was it slow? Probably because many features had been added since XP and were turned on by default. Why weren't there many third-party drivers? Because Vista was the first widely used 64-bit Microsoft client operating system, which meant that a lot of drivers simply didn't exist; they all had to be recompiled and signed. At the same time, Microsoft improved the Windows driver architecture, breaking compatibility for certain classes of drivers, and manufacturers were slow to respond. And, of course, UAC popping up its confirmation dialog for every minor settings change and every install was technically more secure, but obviously flawed as an experience. Windows Vista was technically far more functional and secure than Windows XP, but because of its bad UX, it is now remembered as a failure.
But here is the really important thing. By the time Vista SP2 came out, all of the initial issues were pretty much fixed or had widely known workarounds, and yet the negative perception of Vista remained. Windows Vista's original bad UX forever tarnished its image.
The moral, which should be pretty obvious by now, is that first impressions really do matter, and when somebody finds your incredibly functional yet barely usable public 1.0 release, that person just might give up on your software altogether and put it on a mental blacklist. Cutting features is reasonable. Cutting UX is not. (Of course, there are always exceptions to the rule. I'm not saying that this is absolutely the only way to go. I just want everyone to think hard about the choices they're making instead of simply going with the flow.)