If you worked only 80% but are able to generate the same revenue as working at 100%, then surely one of the following must be true: you either slack off 20% of the time, or the company is overpaying you for the work you do!
Therefore, it must be assumed that if you worked only 80% of the time, you're only 80% effective. And the fixed costs of an employee don't decrease when they work only 80% of the time, so the arrangement ends up more expensive for the company than a simple 20% salary cut would suggest.
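To make that concrete, here's a minimal sketch with made-up numbers (the salary and overhead figures are purely illustrative, not from the comment above): if fixed overhead stays constant while salary and output both scale to 80%, the cost per unit of output rises.

```python
# Illustrative only: invented figures to show why constant fixed overhead
# makes an 80%-time / 80%-pay arrangement relatively more expensive.

FULL_TIME_SALARY = 100_000   # hypothetical annual salary
FIXED_OVERHEAD = 30_000      # benefits, equipment, admin - doesn't shrink with hours

def cost_per_unit_output(salary_fraction: float, output_fraction: float) -> float:
    """Total employer cost divided by the fraction of full output produced."""
    total_cost = FULL_TIME_SALARY * salary_fraction + FIXED_OVERHEAD
    return total_cost / output_fraction

print(cost_per_unit_output(1.0, 1.0))  # 130000.0 per unit of full output
print(cost_per_unit_output(0.8, 0.8))  # 137500.0 - higher cost per unit
```

Of course, this only holds if the "80% time means 80% output" assumption holds, which is exactly what the replies below dispute.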
> Therefore, it must be assumed that if you worked only 80% of the time, you're only 80% effective.
This is the exact logic that is causing the issue. I would posit that a statement like this could only be true in an environment such as a factory line, where the output of widgets is truly linear over time.
Study after study has shown that for information workers, there's a strong trend of diminishing returns as workers clock more time.
Modern offices are truly an exercise in Parkinson's Law.
Further, I would speculate that the way companies encourage employees to optimize for time on task has the insidious effect of pushing individuals toward less efficient methods of completing work, since there is no incentive to finish tasks more quickly when any time saved isn't recovered by the individual.
Depends on the type of work you do and what the company pays you for. Is it the means (your time)? Or the outcome?
I've had quite a few clients and employers ask and haggle, trying to get it both ways.
If you consider a creative job where you only care about the output, then the more creative it is, the less correlated time spent working and output are (above some undetermined threshold).
Since the means (again, time spent) aren't an accurate predictor of the outcome, they don't matter much, and there is no reason to tie the salary to them.
If you consider a non-creative (factory work / fruit picking) or vaguely creative (low-level factory-style coding, data entry) job, then the less creative it is, the more time spent working and output are correlated (up to a physical / mental exhaustion threshold).
Here, the means are a good predictor of the outcome, hence the salary being tied to the number of hours spent working.
And it really comes down to what most IT/software people are. I have a feeling, though, that most of them are closer to fruit pickers, but a lot think of themselves as penning great literature when writing 10-line comments on a 2-line function summing a pair of integers.
I currently work in a pretty rote software job, and I personally feel that even the most basic things can be very taxing on the mind and require excellent focus. In these environments there are very few systems one can engage with where small changes can't have large ramifications, and great care must be taken.
You are confusing time spent with productive work done. Those two are not necessarily related. You also don’t take into account skill demand and supply. There is a very limited pool of good software developers.