A while ago I was taking my blood pressure every morning, with an apparatus that eventually proved so unreliable as to be useless. Tabulating the results, I noticed a prevalence of zigzags: if the reading was higher yesterday than the day before, that seemed to make it less likely that today's reading would be higher than yesterday's. This seemed plausible, in that an increase would make it more likely that the latest reading was above the mean.

Inquiring about this on alt.sci.math.probability, I was rewarded with the following argument. Suppose that the readings are independently & identically distributed (probably a good approximation, tho of course one can imagine events that would affect the blood pressure for periods longer than a day) and that ties are impossible (not strictly true, but the precision of the readings was such that ties never actually occurred). Then, among any three successive readings, all orderings are equally probable -- say 123, 132, 213, 231, 312, 321, where the numerals are mere ordinals representing relative magnitude. Of these, three (123, 132, 231) present an initial increase, and of those only one (123) presents a further increase. Thus the odds are 2 to 1 against a further increase; and likewise, after a decrease, the odds are 2 to 1 against a further decrease. Strings of sawteeth are therefore fairly probable.
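The argument is easy to check by simulation. Here is a quick Monte Carlo sketch in Python (not part of the original discussion; the uniform distribution is an arbitrary choice, since the argument claims the answer is the same for any continuous distribution):

```python
import random

def p_increase_after_increase(trials=100_000, seed=1):
    """Estimate P(reading 3 > reading 2, given reading 2 > reading 1)
    for i.i.d. continuous readings.  Uniform readings are an arbitrary
    choice; the counting argument says any continuous distribution
    gives the same answer."""
    rng = random.Random(seed)
    initial = further = 0
    for _ in range(trials):
        a, b, c = rng.random(), rng.random(), rng.random()
        if b > a:            # an initial increase
            initial += 1
            if c > b:        # a further increase
                further += 1
    return further / initial

print(p_increase_after_increase())  # close to 1/3, i.e. odds 2 to 1 against
```

Swapping in Gaussian readings (`rng.gauss(0, 1)`) leaves the estimate unchanged, as the permutation argument predicts.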
This can be generalized: suppose one has experienced a sequence of n increases; what are the odds against the next reading being higher still? Among the n+2 numbers then in hand, there are (n+2)! equiprobable orderings, but only n+2 of them begin with an increasing run of n+1 numbers -- one for each choice of which of the n+2 numbers comes last, the other n+1 being forced into increasing order. Of those n+2, only one has the (n+2)nd reading larger than all the rest. So the odds against a further increase are n+1 to 1: each increase makes a further increase modestly less probable.
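The general count can be verified exactly by brute force. A short Python sketch (my own check, not from the original exchange) that enumerates all orderings of n+2 distinct readings:

```python
from itertools import permutations

def odds_against_further_increase(n):
    """Among all (n+2)! orderings of n+2 distinct readings, count those
    whose first n+1 values are increasing (i.e. n increases observed),
    and how many of those continue with yet another increase.
    Returns (odds against, odds for)."""
    runs = further = 0
    for perm in permutations(range(n + 2)):
        if all(perm[i] < perm[i + 1] for i in range(n)):  # first n+1 increasing
            runs += 1
            if perm[n + 1] > perm[n]:                     # a further increase
                further += 1
    return runs - further, further

for n in range(1, 5):
    print(n, odds_against_further_increase(n))  # prints (n+1, 1) each time
```

For every n it reports exactly n+2 qualifying orderings, of which one continues upward, confirming the n+1 to 1 odds.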
It surprises me that this result is independent of the distribution. In a hasty effort, I did not succeed in finding it in Feller or on the Web.