The initial-value problem does not technically go away, but as time progresses it becomes a decidedly smaller fraction of the total uncertainty relative to structural/parametric uncertainties in models, as well as to the particular scenario (e.g., emissions) one follows in the future. A good illustration of this is Hawkins and Sutton (2009):
http://www.met.reading.ac.uk/~ed/publications/hawkins_sutton_2009_BAMS.pdf
Note Figures 3 and 4 near the end (I particularly like 4c as an illustrative tool). The uncertainty in a prediction is some combination of initial-condition, scenario, and model uncertainty, and the relative fraction of these competing uncertainties changes with time. The relevance of initial conditions drops off rapidly with lead time, while the scenario is rather unimportant until several decades out. A similar plot is shown for the British Isles: the importance of internal variability increases at smaller spatial scales and shorter timescales. Note that the decadal-prediction problem involves a rather complex intersection of all these sorts of uncertainty.
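The flavor of that partitioning can be captured in a toy sketch. The variance magnitudes below are illustrative assumptions of mine, not numbers from Hawkins and Sutton: internal variability is held constant, model (structural) uncertainty grows with lead time, and scenario uncertainty is negligible early but grows rapidly, so the fractional contributions shift just as in their Figure 4.

```python
# Toy sketch of uncertainty partitioning vs. lead time, in the spirit of
# Hawkins & Sutton (2009). All variance magnitudes are made-up illustrative
# numbers, chosen only to reproduce the qualitative behavior:
#   - internal variability: roughly constant
#   - model (structural) uncertainty: grows with lead time
#   - scenario uncertainty: negligible early, dominant late

def uncertainty_fractions(lead_time_years):
    internal = 0.1                                # constant
    model = 0.02 * lead_time_years                # grows linearly
    scenario = 0.0003 * lead_time_years ** 2      # grows quadratically
    total = internal + model + scenario
    return {"internal": internal / total,
            "model": model / total,
            "scenario": scenario / total}

for t in (1, 10, 50, 90):
    f = uncertainty_fractions(t)
    print(t, {k: round(v, 2) for k, v in f.items()})
```

At a one-year lead the initial-condition/internal term dominates; by ninety years out it is a small sliver and scenario choice matters most, which is the qualitative point of their figure.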
This is all well-known, and arises from the fact that the signal from a climate forcing will inevitably grow while the magnitude of internal variability stays roughly constant. The system is also constrained by the laws of physics, e.g., by the top-of-atmosphere energy balance.
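That signal-versus-noise argument can be made concrete with two assumed numbers (a forced trend of 0.02 K/yr and internal variability of 0.15 K; both are round illustrative values, not observations). The forced signal grows linearly while the noise stays fixed, so there is a lead time beyond which the signal "emerges" from the variability:

```python
# Minimal signal-to-noise sketch: a steady forced trend against internal
# variability of constant amplitude. Both parameters are assumed,
# illustrative values.

TREND = 0.02   # K per year, assumed forced warming rate
SIGMA = 0.15   # K, assumed standard deviation of internal variability

def signal_to_noise(years):
    """Ratio of the accumulated forced signal to the (fixed) noise level."""
    return TREND * years / SIGMA

# "Time of emergence": first year the forced signal exceeds 2 sigma.
emergence = next(t for t in range(1, 200) if signal_to_noise(t) > 2.0)
print("signal exceeds 2 sigma after ~%d years" % emergence)
```

With these numbers the signal clears the 2-sigma noise level after roughly a decade and a half; the exact year depends entirely on the assumed trend and variability, but the monotonic growth of signal-to-noise with lead time does not.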
Such statistical predictability is not surprising. Summer in the NH is consistently warmer than winter. If the solar energy absorbed by the planet were to increase, the planet would warm. Even global precipitation/evaporation is constrained by energetic arguments. As another example, the width of the Hadley cell can be approximated to first order on the back of an envelope, and it is straightforward to explain why, for example, Venus has a near-global Hadley cell while on Earth it reaches only to near 30 degrees. The pole-to-equator temperature gradient, combined with Earth's rotation, gives rise to the baroclinic instability we see manifest in mid-latitude cyclones and their associated warm/cold fronts. In the tropics, deep convection sets in only above a certain SST threshold (usually about 26.5 °C in the modern climate). New York does not have 10-year runs of Florida-like temperatures, although on individual days it can. A meaningful climatology can also be established for hurricane track patterns, etc. (e.g., hurricanes do not spontaneously form around the poles and travel across the world's oceans). We see fairly predictable responses in different regions of the globe during ENSO (e.g., most El Niño winters are mild over western Canada and wet in parts of the southern U.S.). We cannot predict an individual El Niño event 50 years from now, but fishing, agriculture, etc. are highly sensitive and respond in rather specific ways to announcements of an upcoming El Niño or La Niña.
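The Hadley cell example really is back-of-envelope. Using the Held–Hou (1980) scaling for the latitudinal extent, phi_H ~ sqrt(5 g H dtheta / (3 Omega^2 a^2 theta0)), round Earth-like inputs give an extent near 25–30 degrees, and simply swapping in Venus's much slower rotation (period ~243 Earth days) and radius pushes the scaling past the pole. The thermal parameters for the Venus case below are deliberately kept Earth-like for simplicity, so it is a rotation-rate argument, not a full Venus calculation:

```python
import math

# Back-of-envelope Held-Hou scaling for the Hadley cell's poleward extent:
#   phi_H ~ sqrt(5 g H dtheta / (3 Omega^2 a^2 theta0))
# Inputs are round illustrative numbers: tropopause height H ~ 15 km,
# radiative-equilibrium equator-pole contrast dtheta ~ 50 K, theta0 ~ 300 K.

def hadley_extent_deg(g, H, dtheta, theta0, omega, radius):
    phi = math.sqrt(5 * g * H * dtheta / (3 * omega**2 * radius**2 * theta0))
    return min(math.degrees(phi), 90.0)  # cap at the pole

earth = hadley_extent_deg(g=9.81, H=1.5e4, dtheta=50.0, theta0=300.0,
                          omega=7.29e-5, radius=6.37e6)
# Venus: same thermal parameters (a simplification), but slow rotation.
venus = hadley_extent_deg(g=8.87, H=1.5e4, dtheta=50.0, theta0=300.0,
                          omega=2 * math.pi / (243 * 86400), radius=6.05e6)
print(round(earth, 1), round(venus, 1))  # roughly 25 deg for Earth; Venus capped at 90
```

Because the extent scales as 1/Omega, slowing the rotation by a factor of a few hundred blows the cell out to the pole, which is the essence of why Venus's overturning circulation is near-global.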
All of these things are part of the reference, equilibrium climate we are used to. If you inserted a small lake into a grid box in some climate model, and all of a sudden you got radically different climate sensitivity estimates and projections out to 2100, that might be more evidence that one cannot meaningfully do climate analysis. Yet there is no indication that the climate behaves in a way that is highly sensitive to initial conditions for longer-term projections, and explanations that invoke chaos ignore the observed fact that the climate behaves within constraints, globally and locally, that we have all grown accustomed to.
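The distinction between trajectory sensitivity and statistical stability can be shown even in the classic Lorenz-63 toy system (standard parameters sigma=10, rho=28, b=8/3; the crude Euler integration and averaging windows below are my own illustrative choices). Two nearly identical initial conditions produce trajectories that diverge completely, the "weather", yet the long-run statistics on the attractor, e.g. the time-mean of z, come out nearly the same, the "climate":

```python
# Lorenz-63: trajectories from nearly identical initial conditions diverge,
# but long-run time averages on the attractor are insensitive to the
# initial condition. Simple forward-Euler integration (illustrative, not
# high-accuracy), with an initial spin-up discarded before averaging.

def lorenz_mean_z(x, y, z, dt=0.005, steps=300_000, spinup=30_000):
    sigma, rho, b = 10.0, 28.0, 8.0 / 3.0
    total = 0.0
    for i in range(steps):
        x, y, z = (x + dt * sigma * (y - x),
                   y + dt * (x * (rho - z) - y),
                   z + dt * (x * y - b * z))
        if i >= spinup:
            total += z
    return total / (steps - spinup)

m1 = lorenz_mean_z(1.0, 1.0, 1.0)
m2 = lorenz_mean_z(1.0, 1.0, 1.000001)  # perturbed initial condition
print(round(m1, 2), round(m2, 2))  # two very close time-means
```

The pointwise states at any late time are uncorrelated between the two runs, but the averages agree closely: statistics of the attractor, not individual trajectories, are the analogue of climate here.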