The thing is, water vapor distribution in the atmosphere is <b>not</b> uniform. This means optical depth in the thermal IR band is only weakly related to average specific humidity; it is mainly determined by higher moments of the distribution (a thin metal plate may be absolutely opaque while a wire fence is almost transparent, even if both contain the same amount of metal per unit area).
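The plate-vs-fence analogy can be put in numbers. A minimal sketch (all figures are made up for illustration): two columns with the <b>same</b> mean optical depth, one spread evenly and one clumped, transmit very different amounts of radiation, because transmittance exp(-tau) averages nonlinearly across patches.

```python
import math

# Hypothetical numbers: four equal-area patches per scene.
uniform = [2.0, 2.0, 2.0, 2.0]   # "metal plate": absorber spread evenly
clumped = [7.7, 0.1, 0.1, 0.1]   # "wire fence": same mean depth of 2.0

def effective_tau(patch_taus):
    # Average the patch transmittances (Beer-Lambert exp(-tau)), then
    # convert back to a single effective optical depth for the scene.
    mean_trans = sum(math.exp(-t) for t in patch_taus) / len(patch_taus)
    return -math.log(mean_trans)

print(effective_tau(uniform))  # 2.0 exactly
print(effective_tau(clumped))  # roughly 0.39: far more transparent
```

Same mean absorber amount, radically different effective optical depth; that is exactly why the mean concentration alone underdetermines the radiative effect.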
The overall greenhouse effect is set by IR optical depth (or rather, by its relation to SW optical depth), <b>not</b> by the average concentration of greenhouse gases (water vapor included).
Therefore IR optical depth can go either up or down when the concentration of well-mixed components is increased. There is a good chance it remains fairly stable (at a value of ~1.87) over a wide range of well-mixed component concentrations, and all that is required for this effect is a slight scale-invariant redistribution of atmospheric humidity. That, of course, cannot be represented in gridded computational models by any means other than parametrization. What is more, it has not been measured so far either, so any parametrization scheme is virtually free of empirical constraints.
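The claimed stabilization can be sketched in a toy two-patch model (my illustration, not anything from the literature; the mean water-vapor depth TAU_W and the target value are assumed numbers). A fraction f of each column is moist and holds all the water vapor, so clumping changes while the column-mean water amount stays fixed; a well-mixed GHG adds opacity everywhere. For each well-mixed contribution, bisection finds the moist fraction that pins the effective depth at the target:

```python
import math

TAU_W = 2.0    # assumed mean water-vapor IR optical depth, held fixed
TARGET = 1.87  # the hypothesized stable effective optical depth

def tau_eff(f, tau_g):
    # Moist fraction f carries all water (depth TAU_W / f), the rest is dry;
    # well-mixed contribution tau_g is added uniformly. Average the patch
    # transmittances, then convert back to an effective depth.
    trans = f * math.exp(-(TAU_W / f + tau_g)) + (1 - f) * math.exp(-tau_g)
    return -math.log(trans)

def moist_fraction_for_target(tau_g):
    # tau_eff increases monotonically with f on this interval, so bisect.
    lo, hi = 0.05, 1.0
    for _ in range(60):
        mid = (lo + hi) / 2
        if tau_eff(mid, tau_g) < TARGET:
            lo = mid
        else:
            hi = mid
    return lo

for tau_g in (0.0, 0.3, 0.6):
    f = moist_fraction_for_target(tau_g)
    print(f"tau_g = {tau_g:.1f}  ->  moist fraction f = {f:.2f}")
```

In this toy setup the moist fraction shrinks as the well-mixed contribution grows, i.e. the same total water vapor clumped into a smaller part of the sky exactly cancels the added opacity. Whether anything like this operates in the real atmosphere is precisely what is unmeasured.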
It is high time to trash the current paradigm in climate science and start measuring optical depth as a function of frequency. It is not so difficult: it only needs several satellites in high orbit carrying wideband imaging instruments with good temporal resolution, plus lots of surface-based transmitters operating in many narrow frequency bands, each transmitter emitting a unique long-period pseudorandom sequence in each band. The signal-to-noise ratio of the intensity measurement can be made high that way, which yields an optical depth map of the atmosphere with good frequency and spatio-temporal resolution.
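The SNR claim rests on correlation against a known code, as in radar and GPS ranging. A minimal sketch (all numbers hypothetical): a transmitter emits a known ±1 pseudorandom sequence, the received copy is attenuated and buried in noise far stronger than the signal, and correlating with the known code averages the noise away, recovering the attenuation factor:

```python
import random

random.seed(0)  # reproducible illustration

N = 100_000                                            # code length
code = [random.choice((-1.0, 1.0)) for _ in range(N)]  # known PRN sequence
attenuation = 0.35                                     # exp(-tau), the unknown
noise_sigma = 5.0                                      # noise power >> signal

# Received samples: attenuated code plus heavy Gaussian noise.
received = [attenuation * c + random.gauss(0.0, noise_sigma) for c in code]

# Correlate with the known code: E[code * received] = attenuation, since
# the noise (and any other transmitter's code) is uncorrelated with ours.
estimate = sum(c * r for c, r in zip(code, received)) / N
print(f"true attenuation {attenuation}, estimated {estimate:.3f}")
```

The statistical error shrinks as 1/sqrt(N), so a long enough sequence digs an arbitrarily weak signal out of the noise; converting the recovered attenuation to optical depth is then just -log of the estimate.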
I can't see why this age-old hack, developed in radar technology, is not applied in climate science. Of course, it may turn out that average IR optical depth is in fact more stable than projected by computational models; so what? That is the way science is supposed to work, is it not? On the other hand, it may turn out that average IR optical depth is proportional to well-mixed GHG concentrations with a specific constant of proportionality. Or it may depend on latitude. Whatever. One will never know until it gets <i>measured</i>.
The upshot is that the flatlining temperatures observed over the last one or two decades may be caused by a hidden, as yet unidentified homeostatic mechanism mediated by changes in fine details of the water vapor distribution (never represented properly in computational models, nor ever measured). In that case any increase in carbon dioxide concentration (below a very high level) could only have a transient effect.
Of course, scale-invariant changes in the water vapor distribution (such as changes in its fractal dimension) can have their own, probably second-order, effects on climate, but one cannot even start to identify them until a proper framework is established.