Basically, my point is that your wage rate at your job does not define the value of your time. It's the marginal overall utility rate of your best option. I didn't focus on this explicitly because it's not a point of contention. The only point of contention is with the simplistic economic argument that goes something like this: "It's silly to do something unpleasant that saves you $x/hour when you could work an extra hour instead and make $y>x." There are many things that go into determining whether that trade-off is worth it, but all else equal, it's still fundamentally an invalid argument because of the discrete choice set in salary/workload for most people.
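To make the discreteness point concrete, here's a toy Python sketch (all numbers are made up for illustration): for a salaried worker, the "work an extra hour for $y" option simply isn't in the choice set, so comparing a chore's implicit wage to the salary wage compares against an option that doesn't exist.

```python
# Toy illustration (hypothetical numbers): with a discrete choice set,
# comparing a chore's implicit hourly savings to your salary wage is
# meaningless, because "work an extra hour for pay" isn't an available option.

salary_wage = 40.0             # $/hour at your job
chore_savings_per_hour = 25.0  # $/hour "earned" by doing the chore yourself

# The simplistic argument: 25 < 40, so pay someone and work instead.
# But the actual choice set for a fixed-salary 40-hour/week worker is:
available_options = {
    "do the chore yourself": chore_savings_per_hour * 1.0,  # save $25
    "pay someone and relax": 0.0,                           # save nothing
    # "work a 41st hour for $40" is NOT in the set -- salary is fixed.
}

best = max(available_options, key=available_options.get)
print(best)  # -> do the chore yourself
```

(Leaving out the disutility of the chore itself, of course, since the argument is about the "all else equal" case.)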
Some caveats to that specific point about discreteness:
First, as Vinci pointed out, your salary isn't the only payoff you get from work. Working an extra hour for "free" could increase your future promotion prospects or successes enough to be worthwhile (this is obviously very relevant to graduate students, who make hardly any money but work constantly...). This is very true and applicable to many people. But even then, the wage rate is not the correct comparison; it's the marginal present discounted return to effort. You can try to estimate that, but approximating it by the wage is very wrong.
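As a rough sketch of what that comparison looks like (the payoff size, horizon, and discount rate below are entirely my own made-up numbers), the value of an unpaid extra hour is the wage plus the discounted expected future return to the effort:

```python
# Hypothetical sketch: the right comparison for an unpaid extra hour is the
# marginal present discounted return to effort, not the wage rate.
# All numbers are invented for illustration.

def present_value(payoff, years_until_payoff, discount_rate):
    """Discount a future payoff back to today."""
    return payoff / (1 + discount_rate) ** years_until_payoff

wage = 0.0  # the extra hour itself is unpaid

# Suppose the hour marginally raises promotion odds by enough to be worth
# $2,000 in expectation, realized three years out, discounted at 5%/year:
marginal_return = wage + present_value(2000.0, 3, 0.05)
print(round(marginal_return, 2))
```

The wage rate ($0 here) tells you nothing; the whole value sits in the discounted future term, which is exactly why estimating it is hard and proxying with wages fails.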
The same applies to any other utility you derive or lose from working. If you really love your job, by all means, work the extra hour. And if your income really is a continuous choice (if you're a self-employed carpenter, for example), this isn't an issue (but further down, something else is).
Second, a couple of people commented that over a person's lifespan, any discreteness in the income choice set is negligible. I don't believe that's true at all: switching jobs repeatedly is costly, and once we've decided approximately what we want to do for a living and realized our strengths and limitations, the income/work tradeoffs we can choose from are quite restricted. But maybe over a reeeeally long time span... But that would mean that when deciding whether to take a cab or walk, you're supposed to look ahead sixty years and decide how to change your career path, effort level, and other forms of consumption and leisure at each point in time to earn an additional $30. Of course no one realistically does that, for many perfectly legitimate and non-psychological reasons that I won't bother to enumerate. When faced with a choice between abandoning a simplistic model and rescuing it with heroic, self-evidently wrong assumptions, I'll go with the former. (Anti-behavioralists, please take note.)
Of course, even if the choice set is discrete in the long run, long-term considerations are relevant to the extent that we have an approximate idea of what our future income will or could be. We don't make decisions completely in isolation; we decide whether eating out once a week is something that's worth it in general, and if we really want to be able to do that, it makes sense to go for a higher-salary job. But I assumed away these non-marginal decisions by considering a person who has already chosen their salary as close to optimal as possible (which, for many people, i.e. the people who originally motivated my argument, such as those working 39 hours per week at a fixed hourly wage, is still very far from optimal). I certainly encourage thinking about these marginal decisions non-marginally, but only because it's very hard to estimate our true valuations when the quantities in question are very small.
I also apparently encourage marginal usages of the word 'marginal'. But never uses of the word marginal to mean 'of secondary importance'. Hmm, that last 'marginal' was at the crucial margin; 'marginal' doesn't make sense as a word anymore...
Anyway, I think this is really the core of the disagreement. Economists apparently think that our choices of income are much more continuous than they really are. (Yeah, I'm so sure of my opinion on that one that I'm not even saying they think our choice set is more continuous than I think it is, but more continuous than it is in reality =)
While the following isn't a point of contention, it was a large point of confusion (my fault for not being clear). I said that it could be reasonable to choose less work even if the total time spent on trivial low-wage tasks exceeds the unit of work/salary you can increment by. That's just saying that utility from work is the sum of wages and subjective experiential utility, and for me the latter decreases drastically after a certain point. You have to pay me a hell of a lot more than 50% extra to work 60-hour weeks instead of 40. Even if I love my job (which I do, currently), my enjoyment is concave in hours worked. If I've just spent 80 hours doing research, I'd rather vacuum the apartment or change the oil in my motorcycle than work an 81st hour, even though I obviously chose my job over being a maid or a mechanic. This is extremely obvious/intuitive, but it does not fit into economic toy models that ignore subjective utility or allow for only one kind of work and one kind of leisure.
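The "more than 50% extra" point falls straight out of any convex cost of hours. Here's a minimal sketch (the quadratic form and the numbers are my own toy assumptions, not a claim about anyone's actual preferences):

```python
# Toy model: if the disutility of work is convex in hours (each extra hour
# hurts more than the last), then the pay needed to compensate a 60-hour
# week is much more than 1.5x the pay that compensates a 40-hour week,
# even though 60 hours is only 1.5x the time.

def disutility(hours):
    return hours ** 2  # convex cost of hours (assumed functional form)

pay_40 = disutility(40)   # pay that exactly offsets a 40-hour week: 1600
pay_60 = disutility(60)   # pay that exactly offsets a 60-hour week: 3600

print(pay_60 / pay_40)  # -> 2.25, well above the 1.5x a flat wage implies
```

So a constant hourly wage systematically underpays the marginal hour at high workloads, which is exactly why "just work one more hour at your wage rate" rings false.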
Anyway, to sum up: your value of time is determined by a whole lot of things and is definitely not well approximated by your wage rate, or even necessarily by your utility-rate of work. Your wage rate does of course affect your value of time in other ways, most importantly, I would guess, by affecting your marginal value of money itself. Someone making $100 an hour is hardly affected by spending $30 on a cab; that's not true for someone making minimum wage. And that holds even if the minimum-wage guy loves his job and the other guy hates his just enough that their utility-rates of work are exactly the same.
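That last point is just diminishing marginal utility of money. A quick sketch with the standard log-utility assumption (the income figures are hypothetical, chosen to roughly match $100/hour and minimum wage):

```python
import math

# Log-utility sketch (standard diminishing-marginal-utility assumption;
# the income numbers are hypothetical): the same $30 cab fare costs far
# more utility at a low income than a high one, even if both people's
# utility-rates of work happen to be identical.

def utility(money):
    return math.log(money)

weekly_income_high = 4000.0  # roughly $100/hour
weekly_income_low = 290.0    # roughly minimum wage
cab_fare = 30.0

loss_high = utility(weekly_income_high) - utility(weekly_income_high - cab_fare)
loss_low = utility(weekly_income_low) - utility(weekly_income_low - cab_fare)

print(loss_low > loss_high)  # the fare bites much harder at low income
```

Same $30, wildly different utility cost, which is why the wage rate matters for the cab decision through the value of *money*, not through the value of time.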
Please let me know if I'm relying on any more implicit assumptions that I should spell out =)