Hygrometer Calibration (Final)
Below are 4.3 days of data from the hygrometer in a salt box. The hygrometer was calibrated at ~4500 minutes, and all of the data below include the applied adjustment factor (i.e., the vendor app adjusts all data retroactively). The hygrometer was left in the box for another day following calibration; over that time, the measured RH continued to increase by about 0.5%.
When you do a salt-box calibration like this, there are two main factors you are fighting against: first, the time delay for the ambient bulk humidity in the box to equilibrate to the salt solution's RH; second, the time delay inherent in all commercially available adsorption-based sensors like this one. This is on top of other typical instrument errors like non-linearity, hysteresis, noise, and overall bias.
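The combined effect of those two delays can be sketched as two first-order lags in series - the box air chasing the salt's equilibrium RH, and the sensor chasing the box air. The salt (NaCl at ~75.3 %RH) and both time constants below are illustrative assumptions, not measured values:

```python
# Sketch of the two delays above as first-order lags in series:
# box air chasing the salt's equilibrium RH, sensor chasing the box air.
# All constants here are illustrative assumptions, not measured values.
RH_SALT = 75.3      # %RH, saturated NaCl equilibrium near room temperature
TAU_BOX = 120.0     # min, assumed box-air equilibration time constant
TAU_SENSOR = 600.0  # min, assumed sensor time constant

def simulate(minutes, rh_start=40.0, dt=1.0):
    """Euler-step both lags; return (box air RH, measured RH) at the end."""
    rh_air = rh_meas = rh_start
    for _ in range(int(minutes / dt)):
        rh_air += dt * (RH_SALT - rh_air) / TAU_BOX
        rh_meas += dt * (rh_air - rh_meas) / TAU_SENSOR
    return rh_air, rh_meas
```

With these assumed constants, after a simulated day (1440 min) the box air is essentially settled while the measured value still lags a few %RH behind - i.e., the slower of the two lags ends up dominating the wait.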
I was hoping that comparing against a trendline would at least indicate when the box hit the salt's equilibrium point, but it looks like each jump in the Error plot below was just due to temperature fluctuations. However, tracking the rate of change of the error (ΔError/Δt) over the course of the run does at least give you an idea of the random error for the sensor. Without knowing the specs of the exact chip the hygrometer uses, I compared the change in short/mid/long-period trailing averages (below) and called it 'good enough' at 4500 mins.
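A sketch of that trailing-average check: compute short/mid/long trailing averages and report when they first agree within a tolerance. The window lengths and tolerance here are illustrative choices, not the values actually used in the run:

```python
from collections import deque

def stabilization_time(readings, windows=(30, 120, 480), tol=0.1):
    """Return the first index at which short/mid/long trailing averages
    of `readings` all agree within `tol` (%RH); None if they never do.
    Window lengths (in samples) and tolerance are illustrative choices."""
    bufs = [deque(maxlen=w) for w in windows]
    for i, r in enumerate(readings):
        for b in bufs:
            b.append(r)
        if len(bufs[-1]) == bufs[-1].maxlen:  # wait for longest window to fill
            avgs = [sum(b) / len(b) for b in bufs]
            if max(avgs) - min(avgs) <= tol:
                return i
    return None
```

On a synthetic exponential approach the three averages only converge once the curve has flattened, which is the same 'good enough' criterion described above.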
The second item - the inherent time delay of adsorption-based humidity sensors - arises because the sensor substrate itself exhibits a similar logarithmic boundary-layer behavior. For the chips themselves, this is typically reported as 'response time' - i.e., the time for the sensor to reach 90% of the actual value - and is usually on the order of a few seconds to a few minutes for most process-control sensors. This hygrometer is intended more for residential smart-HVAC-type applications, so it has a much longer response time (~1 day based on the run below) - though some of that could be due to the first factor (time for the ambient RH to equalize to the salt's equilibrium RH). Plus, it's a bit old and has been abused a bit - both of which would be expected to increase the response time (e.g., as the substrate gets contaminated over time, etc).
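A minimal sketch of estimating that 90% response time from logged step-response data. The `t90` helper and the assumption that the last reading is fully settled are mine, not anything from the vendor:

```python
def t90(times, readings):
    """Estimate response time as the first time the reading covers 90% of
    the step from its initial value to its final value.
    Assumes the last reading is fully settled (an approximation)."""
    start, final = readings[0], readings[-1]
    target = start + 0.9 * (final - start)
    rising = final > start
    for t, r in zip(times, readings):
        if (r >= target) if rising else (r <= target):
            return t
    return None
```

Note that on a salt-box run this measures the combined delay (box + sensor), so it overestimates the chip's own response time.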
That last 10% goes progressively slower - each further decade of remaining error takes about 10x longer - so this one should probably take about 11 days total to reach 99% (1 day for the first 90%, 10 more days for the next 9%, 100 more days for the next 0.9%, etc).
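The arithmetic behind that estimate, assuming each further decade of remaining error really does take 10x longer as posited above (for comparison, a pure first-order exponential would instead take roughly equal time per decade, ~2.3 time constants each):

```python
def time_to_fraction(decades, first_decade_days=1.0):
    """Total days to close `decades` successive decades of remaining error,
    assuming the first 90% takes `first_decade_days` and each further
    decade takes 10x longer (the scaling posited above)."""
    return sum(first_decade_days * 10 ** k for k in range(decades))

# 90% -> 1 day; 99% -> 11 days; 99.9% -> 111 days
```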
The point being that - at least for this sensor - the response-time error, relative to the timeframe over which you are trying to measure a change, would likely be much larger than the other sources of instrument error. The next step would be checking this against an actual process-control chip with a known response time.