No, it is not black and white. Whether an approximation is the better choice depends on the application. For example, there is nothing wrong with using an approximation if it gives a good enough result, say in games, where it can buy a meaningful speedup. But if you are doing a calculation where accuracy really matters, you should not use one.
Second example: in industrial controls there are plenty of applications where you need to calculate complex things, but in most cases an approximation is much faster and gives a good enough result. Since the machines have error margins themselves, there is no point in doing an exact calculation that offers only a small improvement over the approximation when the machine (the mechanical parts, you know) itself limits the control accuracy.
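To make that concrete (my own illustration, not from any particular control system): Bhaskara I's classic rational approximation of sine avoids a library trig call entirely, and its worst-case error is a fraction of a percent, which is already well below what many mechanical tolerances can resolve.

```python
import math

def sin_approx(x):
    # Bhaskara I's rational approximation of sin(x), valid for 0 <= x <= pi.
    # Uses only multiplications and one division; no trig call needed.
    return 16 * x * (math.pi - x) / (5 * math.pi ** 2 - 4 * x * (math.pi - x))

# Check the worst-case error against the "exact" library sine over [0, pi].
samples = [i * math.pi / 1000 for i in range(1001)]
max_err = max(abs(sin_approx(x) - math.sin(x)) for x in samples)
print(max_err)  # roughly 0.0016 -- far below typical mechanical error margins
```

If the machine can only position to, say, 0.1% anyway, the extra accuracy of the exact sine buys you nothing, which is exactly the point above.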