The classic analysis of river flow recession hydrographs Q(t) developed by Brutsaert and Nieber (1977) leads to a graphical representation of the relation ln(−dQ/dt) = ln[f(Q)] in the form of a scattered cloud of points. This paper analyses measurement errors in river flow values as one possible factor generating this dispersion. A numerical experiment shows close similarity between the point cloud obtained from the classic analysis and the experiment output. It is shown that constant time steps between consecutive flow measurements subject to random errors introduce a systematic error into the analysis results and should be replaced by intervals of variable duration, derived from the condition of constant stage/depth decrements.
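The mechanism described above can be sketched numerically. The snippet below is an illustrative reconstruction, not the paper's actual experiment: it samples a hypothetical linear-reservoir recession Q(t) = Q0·exp(−t/k) at constant time steps, optionally perturbs the samples with Gaussian "measurement" errors, and builds the Brutsaert–Nieber point cloud (ln Q, ln(−dQ/dt)) by finite differences. All parameter values (Q0, k, dt, sigma) are arbitrary assumptions chosen for the sketch.

```python
import math
import random

def recession_points(q0=100.0, k=20.0, dt=1.0, n=50, sigma=0.0, seed=1):
    """Sample Q(t) = q0*exp(-t/k) at constant steps dt, optionally add
    Gaussian measurement errors (std sigma), and return the
    Brutsaert-Nieber cloud as (ln Q, ln(-dQ/dt)) pairs."""
    rng = random.Random(seed)
    q = [q0 * math.exp(-i * dt / k) + rng.gauss(0.0, sigma)
         for i in range(n)]
    points = []
    for i in range(n - 1):
        dq = (q[i] - q[i + 1]) / dt      # finite-difference estimate of -dQ/dt
        qm = 0.5 * (q[i] + q[i + 1])     # mid-interval flow value
        if dq > 0 and qm > 0:            # logarithm defined only for positives
            points.append((math.log(qm), math.log(dq)))
    return points

# Error-free samples collapse onto a straight line (slope 1 for a linear
# reservoir); adding random errors scatters the cloud, qualitatively
# resembling the dispersion seen in field recession data.
clean = recession_points(sigma=0.0)
noisy = recession_points(sigma=0.5)
```

For the error-free case the points lie exactly on a line of unit slope, since ln(−dQ/dt) = ln Q − ln k for a linear reservoir; with sigma > 0 the same routine produces a scattered cloud, which is the dispersion the abstract attributes partly to measurement errors.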