On-line algorithms are usually analyzed using competitive analysis, in which the performance
of an on-line algorithm on a sequence is normalized by the performance of the optimal off-line
algorithm on that sequence. In this paper we introduce adaptive/cooperative analysis as an
alternative general framework for the analysis of on-line algorithms. This model gives promising
results when applied to two well known on-line problems, paging and list update. The idea is
to normalize the performance of an on-line algorithm by a measure other than the performance
of the off-line optimal algorithm OPT. We show that in many instances the performance of OPT
on a sequence is a coarse approximation of the difficulty or complexity of a given input. Using
a finer, more natural measure we can separate paging and list update algorithms that were
otherwise indistinguishable under the classical model. This creates a performance hierarchy of
algorithms which better reflects the intuitive relative strengths between them. Lastly, we show
that, surprisingly, certain randomized algorithms which are superior to MTF in the classical
model are not so in the adaptive case. This confirms that the ability of the adaptive model
to ignore pathological worst cases can lead to algorithms that are more efficient in
practice.
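To make the list update setting concrete, the following is a minimal sketch of the Move-To-Front (MTF) algorithm mentioned above, under the standard cost model in which accessing the item at position i (counting from 1) costs i, after which the accessed item is moved to the front of the list. The function name and input conventions are our own illustration, not taken from the paper.

```python
def mtf_cost(request_sequence, initial_list):
    """Serve a request sequence with Move-To-Front (MTF).

    Accessing the item at (1-based) position i costs i; the item
    is then moved to the front of the list. Returns the total
    access cost over the whole sequence.
    """
    lst = list(initial_list)
    total = 0
    for x in request_sequence:
        i = lst.index(x)      # 0-based position of the request
        total += i + 1        # access cost under the standard model
        lst.pop(i)
        lst.insert(0, x)      # move the accessed item to the front
    return total
```

For example, starting from the list `a, b, c`, the sequence `c b c b` costs 3 + 3 + 2 + 2 = 10 under MTF, since each repeated item is found nearer the front after its first access. Under competitive analysis such a cost would be compared against OPT's cost on the same sequence; the adaptive/cooperative framework instead normalizes by a finer measure of the input's difficulty.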