Reconciling Fitts' law with Shannon's information theory
Abstract
Shannon's information theory had a tremendous impact on various scientific fields in the 1950s and 1960s, including psychology [1]. During that period, Shannon and colleagues [2,3] reacted strongly against the exaggerated use of information-theoretic ideas in fields such as psychology, biology, and linguistics. The fact is that Shannon's mathematical theory of communication has been more or less set aside in psychology and is no longer considered useful today [4].
What seems to be an important exception is Fitts' law [5,6], a well-known empirical rule that predicts the average time T it takes people, under time pressure, to reach a target of width W located at distance D with some pointer. The movement time is a linear function of the index of difficulty ID, which is generally given by the so-called Shannon formulation ID = log2(1 + D/W) [7] (an ISO standard), although other similar formulations can also be considered [8].
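To make the prediction concrete, here is a minimal Python sketch of the Shannon formulation (not taken from the paper). The intercept a and slope b of the linear law MT = a + b·ID are empirical, device-dependent constants; the values used below are purely illustrative.

```python
import math

def index_of_difficulty(D, W):
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(1 + D / W)

def movement_time(D, W, a=0.1, b=0.15):
    """Predicted mean movement time MT = a + b * ID, in seconds.
    a and b are empirical, device-dependent constants; these values
    are illustrative placeholders, not fitted data."""
    return a + b * index_of_difficulty(D, W)

# Example: a 20-pixel-wide target located 300 pixels away.
print(index_of_difficulty(300, 20))   # log2(1 + 300/20) = 4.0 bits
print(movement_time(300, 20))         # 0.1 + 0.15 * 4.0 = 0.7 s
```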
Despite an attempt to theoretically explain Fitts' law in terms of information theory [9], many different aspects remain unclear [10], and it seems that the only justification is a vague analogy with Shannon's capacity theorem. Building on [11], we go beyond the analysis of [9] and develop a new mathematical approach that derives Fitts' law from information-theoretic arguments:
We first present combinatorial arguments in various geometric frameworks to account for different formulations of the index of difficulty.
We then propose a simple communication channel model for rapid aimed movement leading to Fitts' law, with discrete input (the intention of the participant) maximizing the information to be transmitted, uniform additive noise (maximizing entropy under the simple assumption of zero error), and continuous output representing the endpoint coordinates.
Finally, we show that the formulation ID = log2(1 + D/W) can be obtained anew by a rigorous derivation of Shannon's capacity theorem for our simple channel model, reconciling Fitts' law with Shannon's Theorem 17: C = W log(1 + S/N). A numerical sketch of this channel model is given below.
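As a rough numerical illustration of the channel model, here is a Monte Carlo sketch under assumptions not spelled out in this abstract: the discrete input takes values on targets spaced exactly W apart over [0, D] (with D/W an integer), and the additive noise is uniform over an interval of width W so that the endpoint never leaves the intended target. The estimate of the transmitted information I(X;Y) = h(Y) − h(Y|X) then recovers log2(1 + D/W).

```python
import numpy as np

rng = np.random.default_rng(0)
D, W = 160.0, 20.0          # distance and target width (D/W integer here)
n = 1_000_000

# Discrete input: the participant "intends" one of D/W + 1 targets spaced W apart.
targets = np.arange(0, D + W, W)            # 0, W, 2W, ..., D
X = rng.choice(targets, size=n)

# Additive uniform noise of width W (zero error: the endpoint stays on the target).
Z = rng.uniform(0.0, W, size=n)
Y = X + Z                                   # continuous endpoint, uniform on [0, D + W]

# Plug-in histogram estimate of the differential entropy h(Y), in bits.
counts, edges = np.histogram(Y, bins=200)
p = counts / n
bin_width = edges[1] - edges[0]
hY = -np.sum(p[p > 0] * np.log2(p[p > 0])) + np.log2(bin_width)

hY_given_X = np.log2(W)                     # noise entropy: h(Y|X) = log2 W
print(hY - hY_given_X)                      # ≈ log2(1 + D/W) = log2(9) ≈ 3.17 bits
print(np.log2(1 + D / W))
```

The point of the sketch is that, with this input spacing, the output Y is uniform over an interval of length D + W, so h(Y) − h(Y|X) = log2((D + W)/W) = log2(1 + D/W), which is exactly the Shannon formulation of the index of difficulty.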
Domains
Human-Computer Interaction [cs.HC], Information Theory [math.IT], Functional Analysis [math.FA], Classical Analysis and ODEs [math.CA], General Mathematics [math.GM], Information Theory [cs.IT], Cryptography and Security [cs.CR], Discrete Mathematics [cs.DM], Signal and Image Processing [eess.SP], Statistics [math.ST], Probability [math.PR]
Main file
201509goririoulguiard.pdf (131.15 Ko)
201509goririoulguiard-slides.pdf (333.67 Ko)
Origin: Files produced by the author(s)