Smoothing reminds me of a job I had. We collected data from power plants and presented it to the operators as Efficiencies (i.e. how efficient a turbine was running).
We "smoothed" our data also. By time-weighting the data, we avoided "spikes" in our output. It was possible to get a faulty reading from an instrument (due to any of the steps involved in taking that reading), and smoothing (time weighting) made those hiccups disappear. The readings also had genuine variability in them: some of the calculations (around boilers) were sensitive to the temperature and mass flow rate inputs (given that these are the two major constituents of a heat balance, that would seem absurdly obvious).
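For anyone curious what "time weighting" looks like in practice, here is a minimal sketch of one common form, exponential smoothing, where each displayed value blends the newest reading with the previous smoothed value. The function name and the weighting constant `alpha` are my own illustrative choices, not anything from the original system.

```python
def smooth(readings, alpha=0.2):
    """Exponentially smooth a sequence of instrument readings.

    Each output is a blend of the new reading (weight alpha) and the
    previous smoothed value (weight 1 - alpha), so a single faulty
    spike is damped rather than passed straight to the operator display.
    """
    smoothed = []
    prev = None
    for r in readings:
        prev = r if prev is None else alpha * r + (1 - alpha) * prev
        smoothed.append(prev)
    return smoothed

# A spurious 999-degree spike in otherwise steady turbine temperatures
# is heavily damped in the smoothed trace (and decays over a few samples).
temps = [500.0, 501.0, 999.0, 500.5, 500.0]
print(smooth(temps))
```

The smaller `alpha` is, the harder the spikes are suppressed, but the slower the display responds to a real change in the process; that trade-off is exactly why smoothing suits real-time displays better than after-the-fact analysis.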
All this is to say that "smoothing" has its place, most specifically when dealing with the vagaries of real-time systems trying to mesh with first-principles calculations.
I continue to question the validity of such smoothing in non-realtime analysis. A scatter graph of temperature versus time (or concentration versus time) would probably be more honest. The trouble with such graphs is that they start to look like big fat lines (especially in the case of temperature).
Now I catch myself saying "There ought to be a law" about graphing.
But I will keep my mouth shut. The unintended consequences of a Minister of Graphing and the resultant police force cause me to quiver in fear.
The real answer is of course educating our children to understand graphs. That is not easy. My children will be okay, but I am not sure about my friends and neighbors.