I recall reading, a while ago, a KB article or white paper that described the analysis backfill process in depth, and I'm looking for it again.
In it, there was mention that when 'delete existing data' was selected, the data was deleted and recalculated in chunks, perhaps a day at a time. As a result, most of the recalculated data went into the archive as out-of-order (OOO) events. However, if the end time was *, the last chunk of deleted data included the current snapshot, so that final chunk of recalculated values went in as new snapshots and was therefore subject to compression.
I've just mentioned this to a colleague, and I'd like to be able to back it up with the documentation (and to reassure myself that I haven't imagined it).
Does anyone know the document I'm thinking of? If so, can you point me to it, or to anything else that confirms or refutes my understanding?