I mean, big O comes from that particular branch of math. The whole point of it -- even in CS -- is that it's asymptotic analysis. (It's also long been criticized in CS for this deficiency. Usually the complaint is the hidden constant factor, but the minimum problem size before the asymptotics even kick in is an issue too!)
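To make the "hidden constant factor" point concrete, here's a toy sketch. The step costs (`10_000 * n` vs `n * n`) are made-up illustrative numbers, not measurements of any real algorithm: an O(n) routine with a large constant loses to an O(n^2) routine until n gets fairly big.

```python
def steps_linear(n, c=10_000):
    """Hypothetical O(n) algorithm that costs c*n basic steps (c is assumed)."""
    return c * n

def steps_quadratic(n):
    """Hypothetical O(n^2) algorithm that costs n^2 basic steps."""
    return n * n

# Big O only promises the linear routine wins "eventually" -- here, not
# until n exceeds 10_000, which may be past every input you'll ever see.
crossover = next(n for n in range(1, 100_000)
                 if steps_linear(n) < steps_quadratic(n))
print(crossover)  # first n where the asymptotically better routine wins
```

That's exactly the "minimum problem size" complaint: both claims "O(n)" and "O(n^2)" are true, and neither tells you which routine to pick for n = 1000.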
It's fine colloquially misusing big O to mean something like "for usual inputs, the running time is roughly this function", but you can't then in the same breath say all practical algorithms are technically O(1), because that's just not true in the technical sense (and it's useless or at least inaccurate in the colloquial one).
It doesn't really matter, but to add more to the discussion, the way I see it is that computer programs are implementations of algorithms on some restricted set of inputs. The algorithm has the asymptotic analysis, but the program itself does not.
> It's fine colloquially misusing big O to mean something like "for usual inputs, the running time is roughly this function", but you can't then in the same breath say all practical algorithms are technically O(1), because that's just not true in the technical sense (and it's useless or at least inaccurate in the colloquial one).
My argument is "there's no reason to stick to the exact technical definition, because the exact technical definition breaks when you apply it to real computers". I don't think that's contradictory in any way.
But I don't think this is a misuse either. Talking about "usual inputs" and "roughly" would be a misuse, sure. But if I can prove it uses at most an^2 + bn + c steps, and I call it O(n^2), then what's the problem?
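To spell out that last point: an exact step count of the form a*n^2 + b*n + c really is O(n^2) by the textbook definition, no hand-waving about "usual inputs" required. A toy sketch (the per-step costs in the comments are assumptions chosen to make the count easy):

```python
def count_steps(n):
    """Exactly count the 'basic steps' of a toy double loop."""
    steps = 0
    for i in range(n):
        steps += 1          # outer-loop overhead: 1 step per iteration
        for j in range(n):
            steps += 2      # inner body: 2 steps per iteration
    steps += 3              # fixed setup/teardown cost
    return steps

# The exact count is 2n^2 + n + 3. That's O(n^2) in the strict sense:
# it's bounded above by (2 + 1 + 3) * n^2 for all n >= 1.
for n in (1, 5, 100):
    assert count_steps(n) == 2*n*n + n + 3
```

So calling this O(n^2) is a provable upper bound on the exact count, not a colloquial approximation.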