Postural stability during illusory self-motion—interactions of vision and touch

Visual motion affects posture

As discussed in the Introduction, it is well established that visual motion induces changes in postural sway. Our task elicited illusory self-rotation and displacement (SR&D) using continuous 360° full-field virtual rotation of a well-structured room environment. Our findings revealed a distinction between perceived environmental rotation (the visually rotating scene, VRS) and illusory SR&D. In general, center of pressure (CoP) and weight force (WF) fluctuations were greater during SR&D than during periods when only a moving visual scene was perceived. When the structured visual environment rotated in yaw, body sway was perturbed in all directions, as reflected by changes in the mean fluctuations and in the Stabilogram Diffusion Function (SDF) analysis. Furthermore, SR&D significantly increased postural sway compared to the sway observed during VRS. These effects were evident in specific SDF parameters, namely the Exponent and the Critical Timescale (Table 1B).

Normally, SR&D perception involves multisensory stimulation—optic flow, vestibular inputs that provide independent information about head movement, and proprioceptive and somatosensory information. A rotating visual environment in an HMD elicits conflicting sensory inputs over time; for example, when SR&D is perceived, there are no signals from the semicircular canals to indicate motion onset. One interpretation could be that this sensory conflict increases postural sway. Alternatively, Gibson (1966) argued that multisensory patterns can provide orientation information without requiring separate sensory inputs to compete. In our study, no change in vestibular semicircular canal or otolith organ input was triggered because the perception of self-motion was entirely illusory. Visual signals alone were strong enough to induce changes in postural balance, reflected by a noticeable increase in weight loading fluctuations and by the center of pressure fluctuations seen in the stochastic measures. When proprioceptive and somatosensory information was provided by fingertip touch, it greatly reduced postural sway.

Start with motion vs. start with no motion

Statistical analyses across all variables revealed a significant main effect of motion order (i.e., starting with a stationary scene and transitioning to scene motion, or vice versa). When a task began with visual rotation and no touch cue was provided, the moving visual input destabilized postural control, resulting in a disruptive aftereffect once the scene became stationary. Interestingly, motion order did not affect the average CoP fluctuations in any direction; its influence appeared only in the SDF measures, which are sensitive to changes in stochasticity. Notably, the Diffusion Coefficient, particularly in the lateral direction, emerged as the only parameter sensitive to motion order (Table 1B). During touch trials, motion order had no main effect on the touch force measures. However, a significant interaction between motion phase and motion order was observed, indicating that the stochastic aspects of postural balance and the characteristics of haptic touch varied with different motion perceptions, depending on whether the trial began or ended with motion.

Visual motion aftereffects might contribute to this effect. Andreeva (2016) found that visual motion aftereffects of opposite sign could be induced by prolonged exposure to a moving environment. In our task, the sudden stop of scene motion might induce eye movements in the direction opposite to the prior optokinetic nystagmus. This could also account for the slight discomfort several subjects reported when the scene stopped. In future studies, we will use eye tracking to measure eye movements during both the onset and stopping of visual motion.

Touch vs. no touch

In all test conditions, non-supportive fingertip touch significantly reduced the mean fluctuations of the center of pressure (CoP) in both the AP and ML directions when visual motion was introduced (see Figs. 4 and 5). Notably, perceptual changes caused by moving visual input did not significantly interfere with the stabilizing effect of fingertip touch; touch consistently reduced postural sway across all conditions. Significant differences due to touch were also observed in the SDFs of CoP and weight force (WF) fluctuations. This effect emerged in specific stochasticity-sensitive parameters, the Area of the SDF profile and the Diffusion Coefficient (Table 1B), but not in the Exponent or Critical Timescale. This contrasts with the effects of motion phase, which appeared in the Exponent and Critical Timescale but not in the Area or Diffusion Coefficient.

In our experimental design, the touch plate was always positioned to the right side of each subject, regardless of handedness. While this could be a limitation, it is mitigated by the fact that 13 of the total 14 subjects were right-handed. Out of the 14 subjects, 9 experienced the touch plate moving in coordination with their fingertip when contact was made, 1 subject felt the touch plate switch from not moving to moving with the fingertip, and 4 subjects did not perceive any touch plate motion. Regardless of the perception of touch plate motion, light touch consistently reduced postural sway. For the 10 subjects who experienced the touch plate moving with their finger, the motion of their fingertip and the touch plate were perfectly synchronized with their perceived body rotation, as though the plate were an extension of their own body.

Previous research (Bakshi et al. 2019) demonstrated that voluntary swaying in a slowly rotating room (60°/s) was influenced by Coriolis perturbations, which significantly deviated the subjects’ sway. This postural sway was also attenuated by active light touch and was best described by stochastic measures. Similarly, the postural sway induced by visual environment rotation in the present experiment followed a stochastic trajectory. SDF measures were instrumental in characterizing the changes in postural sway patterns induced by altered visual motion perception, as well as by the presence or absence of touch, and in understanding how these factors affect different aspects of postural stochasticity. The significant reduction in postural sway with fingertip light touch, observed in both actual and virtual rotation environments, underscores the role of touch in reducing the stochasticity of sway.

Touch magnitude

When visual motion was introduced, the touch forces varied significantly across motion phases in the AP and Z directions (Table 3A). The average touch force magnitudes increased with motion: the smallest values were observed during the stationary scene (SS), intermediate values during the visually rotating scene (VRS), and the maximum forces during self-rotation and displacement (SR&D). Notably, the vertical touch force consistently hovered around 40 g during SR&D, while it was 5 g lower during the stationary phase (Fig. 7A). This aligns with previous research, which identified 40 g as the threshold for maximum cutaneous sensitivity in the fingers (Johansson and Westling 1987; Westling and Johansson 1987). Thus, under self-motion conditions, the central nervous system (CNS) appears to tune automatically into the range of optimal tactile sensitivity. The average fluctuation amplitudes of the touch force also varied across motion phases in all three cardinal directions. However, in contrast to the average force magnitudes, which were lowest during the SS phase, the fluctuation amplitudes were smallest during VRS, with both SS and SR&D exhibiting amplitudes significantly higher than those observed in VRS (Fig. 7B). We also observed a significant interaction between motion phase and motion order on the touch force measures.

Overall, these findings are consistent with touch participating in a haptic long-loop cortical reflex that adjusts postural sway using somatosensory and proprioceptive information, engaging a complex network of areas ranging from motion-specific areas to regions involved in visuo-vestibular integration, visual imagery, decision making, and introspection (Kovács et al. 2008). The modulation of balance through light touch contact is remarkably fast (250–300 ms): EMG activity onset in the leg muscles controlling sway is detectable at 150 ms, and force generation to counter sway is present 150 ms later, well below conscious reaction times (Jeka and Lackner 1995; Rabin et al. 2006).

Interplay of visual motion perception, order and tactile feedback in postural stability

Touch generally stabilizes balance, but our findings suggest a complex interplay between visual perception and the tactile system in the acquisition of postural stability. Interactions between motion phase and motion order emerged in both the classical and SDF analyses for touch force control (Table 3) but only in the SDF analysis for postural control (Table 1). The significant two-way interactions in the SDFs of all three fluctuation measures—CoPAP, CoPML, and WF—highlight the systematic impact of motion sequencing on the stochastic attributes of postural stability. Notably, all significant effects were driven by a single dependent variable, the Critical point (Table 1B), suggesting that the temporal-scale structure of stochastic fluctuations in balance dynamics is strongly influenced by the temporal sequence of different visual motion perceptions. The interaction between motion phase and motion order in the touch force analysis suggests that the sequential execution of motion phases has an integrated effect on force application, possibly reflecting adaptive strategies to maintain stability across varying movement conditions, but with lingering aftereffects. Specifically, when trials began with a stationary scene, vertical touch force fluctuations remained relatively unchanged across motion phases; however, when trials started with scene motion followed by no motion, touch forces were greater when SR&D was perceived, and this effect persisted into the subsequent stationary phase. Interestingly, in the SDF analysis, the interaction between motion phase and motion order emerged selectively in the tangential touch force components but not in the normal component.
The absence of an effect on the normal component suggests that vertical force may serve as an anchoring mechanism for balance maintenance, remaining stable within each balancing context and adapting rapidly to changes in visual perturbation, regardless of sequence. Future studies will further investigate whether lateral and fore-aft touch force adaptations are more sensitive to motion sequencing than vertical force adjustments. If confirmed, this would reinforce the idea that tangential forces play a key role in compensatory movements and fine adjustments to external visual perturbations but do not directly govern balance stability. They may therefore exhibit a hysteresis effect, or a latency in switching, which could explain why changes in motion order influence how subjects regulate forces applied in the horizontal plane. Understanding these dynamics could offer deeper insights into sensorimotor integration and the mechanisms underlying balance adaptation in varying sensory environments.

Distinct effects across directions

The factors of motion phase, motion order, and touch exhibited distinct effects along the three cardinal directions—fore-aft (AP), lateral (ML), and normal (Z). For instance, motion order had a stronger impact on the Diffusion Coefficient of the SDFs of lateral fluctuations (i.e., CoPML and WF) than of fore-aft fluctuations. In contrast, the distinction between SR&D and VRS was noticeable only in the Exponent of the fore-aft CoP fluctuation SDF, not in the lateral direction. Motion phase significantly altered the average touch force magnitudes and fluctuation SDF parameters in the fore-aft and normal directions, but had no effect in the lateral (ML) direction. Interactions between motion phase and motion order were significant in the fore-aft and lateral touch force fluctuation SDF parameters, but not in the normal force component.

SDF parameters capture distinct effects of motion phase, order and touch on balance

For each SDF trace, four dependent variables (DVs) were derived: the Area under the SDF curve (AUC), the Diffusion Coefficient (D), the Critical Timescale (TC), and the Hurst Exponent (EH). The three independent factors—motion phase, motion order, and touch—significantly influenced certain SDF parameters of CoP and touch force fluctuations, while others remained unaffected. For the CoP fluctuations, the motion phase factor significantly affected EH and TC but did not influence AUC or D. The Hurst scaling exponent measures the self-similarity and memory of a time series, indicating persistence (EH > 0.5), anti-persistence (EH < 0.5), or random-walk behavior (EH ≈ 0.5) of fluctuations over time. In this experiment, scaling exponents were always > 0.5 (see Table 2), meaning CoP movements exhibited persistence: past movements are correlated with future movements in a consistent direction. Table 2 shows that the exponent for VRS > SR&D > SS, indicating that the vection illusion increased CoP fluctuation persistence, with the onset of visual motion amplifying this effect. Greater persistence implies a more pronounced drift-like component, which could stem from enhanced anticipatory control in maintaining balance. The Critical Timescale (TC) represents the transition between short-term (open-loop) and long-term (closed-loop) control, marking the timescale at which active postural control begins to dominate over passive control. A shorter TC suggests a rapid response in postural control, whereas a longer TC may indicate delayed corrective mechanisms. Table 2 shows that both motion phases significantly reduced TC compared to SS, implying that the motion stimuli required faster neuromuscular corrections than the stationary condition.
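For readers less familiar with stabilogram diffusion analysis, the SDF underlying these four DVs is simply the mean-square CoP displacement as a function of time lag. The sketch below is a minimal illustration of how such parameters can be estimated from a one-dimensional CoP trace; the sampling rate, region boundaries, and variable names are illustrative assumptions, not the values or code used in our analysis pipeline:

```python
import numpy as np

def stabilogram_diffusion(x, fs, max_lag_s=10.0):
    """Mean-square displacement <Δx²(Δt)> of a CoP trace, averaged over
    all start times t — the Stabilogram Diffusion Function."""
    max_lag = int(max_lag_s * fs)
    lags = np.arange(1, max_lag + 1)
    sdf = np.array([np.mean((x[m:] - x[:-m]) ** 2) for m in lags])
    return lags / fs, sdf

def sdf_parameters(dt, sdf, short_s=0.5, long_region=(2.0, 10.0)):
    """Fit short- and long-term regions; derive D, EH, TC, and AUC.
    Region boundaries (0.5 s; 2–10 s) are illustrative assumptions."""
    short = dt <= short_s
    lng = (dt >= long_region[0]) & (dt <= long_region[1])
    # Diffusion coefficients: <Δx²> ≈ 2·D·Δt in each region (1-D trace)
    a_s, b_s = np.polyfit(dt[short], sdf[short], 1)
    a_l, b_l = np.polyfit(dt[lng], sdf[lng], 1)
    # Hurst exponents: <Δx²> ∝ Δt^(2·EH), so EH is half the log-log slope
    h_s = 0.5 * np.polyfit(np.log(dt[short]), np.log(sdf[short]), 1)[0]
    h_l = 0.5 * np.polyfit(np.log(dt[lng]), np.log(sdf[lng]), 1)[0]
    # Critical point: intersection of the two regression lines
    tc = (b_l - b_s) / (a_s - a_l) if a_s != a_l else float("nan")
    # Area under the SDF curve (simple Riemann sum)
    auc = float(np.sum(sdf) * (dt[1] - dt[0]))
    return {"D_short": a_s / 2, "D_long": a_l / 2,
            "EH_short": h_s, "EH_long": h_l, "TC": tc, "AUC": auc}

# Illustration on a synthetic 60 s trace sampled at 100 Hz: an
# uncorrelated random walk, which should show EH ≈ 0.5 at short lags.
fs = 100
rng = np.random.default_rng(0)
cop = np.cumsum(rng.normal(size=60 * fs))
dt, sdf = stabilogram_diffusion(cop, fs)
params = sdf_parameters(dt, sdf)
```

For the synthetic random walk, the short-term fit recovers EH near 0.5 and D near fs/2 (half the slope of the linear mean-square displacement); real CoP traces instead show the persistent short-term and saturating long-term regions discussed above, whose crossover defines TC.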

The second factor, touch, influences AUC and D but does not influence the complementary parameters EH and TC. AUC represents the total extent of postural sway over different timescales, where a larger AUC suggests increased instability, while a smaller AUC indicates more controlled postural sway. Figures 5, 6, and 7 support this expectation, showing a significant reduction in AUC with touch.

The effect of touch on the Diffusion Coefficient is especially important. In SDF analysis, D reflects spontaneous postural sway driven by biomechanical noise or exploratory behavior, where higher values suggest greater sway instability due to weak neuromuscular control. Table 2 shows about a four-fold reduction in D with touch, indicating a significant decrease in spontaneous sway-induced CoP fluctuations. The third factor, motion order, affects only D, with no effect on the other three parameters. In summary, motion phase affects the passive versus active aspects of the balance control strategy, while touch and motion order primarily influence the noise-dependent stochastic aspects of postural sway.

The SDF parameters of touch force fluctuations indicate: (i) The motion phase factor significantly changes AUC and D for fore-aft touch force fluctuations, has no effect on the lateral direction, and only affects AUC for the normal force component. (ii) The contrast between ambient motion versus self-motion phase is encoded only in AUC and D, with no influence on EH or TC.

Relevance for competing views of how posture and touch are stabilized

Our results have implications for understanding the basis for finger touch contact being able to stabilize balance. The prescient remarks of Gibson provide an important background: “To perceive is to be aware of the surfaces of the environment and oneself in it. The full awareness of surfaces includes their layout, their substances, their events, and their affordances. Note that this definition includes within perception a part of memory, expectation, knowledge, and meaning - some part but not all of those mental processes in each case” (Gibson 1979, p. 255); and, “We are not accustomed to think of the hand as a sense organ since most of our day to day manipulation is performatory, not exploratory. … The perceptual capacity of the hand goes unrecognized because we usually attend to its motor capacity…” (Gibson 1966, p. 123).

There have been several interpretations of how touch is related to the stabilization of posture. Our original studies characterized the stabilization as akin to a form of precision grip in which the finger and legs served as the two “pincers”. Given that the latencies of posture stabilization onset are below conscious reaction times, we related the stabilization to the action of a long-loop cortical reflex, also known as a transcortical reflex (Wiesendanger et al. 1975; Holden et al. 1994; Jeka and Lackner 1994). An alternative view has been that of functional integration: that posture stabilizes to afford stabilization of the hand’s contact with the surface. The functional integration hypothesis (or supra-postural task hypothesis) argues that posture is adaptively shaped to support the execution of supra-postural tasks and that its modulation is subordinate to the demands of goal-directed behaviors (Riccio and Stoffregen 1988). Evidence for this hypothesis is found in postural sway attenuation and modulation in service of task performance during activities requiring fine motor control—such as visually guided movements, object balancing, manual aiming, and precision touching (Riley et al. 1999; Balasubramaniam et al. 2000; Stoffregen et al. 2000; Wulf et al. 2004; Haddad et al. 2013; Chen et al. 2015, 2018; Chen and Tsai 2015).

The functional integration hypothesis contends that such stabilization occurs only when task-specific precision is required. Riley et al. (1999), for example, found that contact with a pliable curtain reduced sway only when the task demanded minimizing contact-point variability. Other work has shown that rigid surfaces elicit stronger stabilizing effects than compliant ones (Franzén et al. 2011; Mauerberg-deCastro et al. 2014; Batistela et al. 2018; Moraes et al. 2018). A potential reason for the diminished postural modulation when touching without a goal, or touching non-rigid contacts, could be that traditional sway metrics overlook subtle effects in more compliant contact conditions. This raises the question of whether the effects of touching flexible surfaces could be revealed by more sensitive measures—such as SDF analysis, which gives a detailed time course of control.

It is plausible that the postural control system flexibly integrates both sensory feedback and task constraints, with the relative influence of each depending on the specific haptic context. Several proposals have attempted to develop an integrated framework that brings the competing hypotheses into cooperative reconciliation, with each mechanism applying in specific domains. For example, Mitra’s (2003) adaptive resource-sharing model suggests that under conditions of increased postural threat (e.g., reduced base of support, sensory challenges), sensory feedback is prioritized to maintain stability; conversely, in stable configurations, the postural system allocates resources to support supra-postural goals when they are present. Chen and Tsai (2015) propose that the postural system operates as a flexible, context-sensitive mechanism even under safe, upright conditions.

We agree with the notion of supra-ordinate representations of postural control in touch stabilization experiments and in any tasks involving directed actions of the hands, arms, legs and whole body. Our view is that such control represents a physical necessity in order for the tasks to be carried out.

Previous work has dealt with how establishing finger contact with a surface can affect posture and the calibration of the arm. For example, when we reach out and contact a target location on a surface, there is a brief period when the three-dimensional reaction force on the touching finger points at the shoulder of the reaching arm; this maintains the accuracy of the arm’s directional calibration (Lackner and DiZio 2000). When the finger touches a surface, EMG activity is evoked within ≈150 ms in the muscles that will act to stabilize posture, and stabilizing force develops over roughly the next 150 ms. Even finger contact with a von Frey filament can, in subjects standing heel to toe, provide some stabilization at finger force levels of 5–10 g; by contrast, holding the finger above the force plate while attempting to maintain it at an imagined location has no stabilizing effect (Lackner et al. 2001). When a subject stands heel to toe and the peroneus longus and brevis tendons of one leg are vibrated to elicit a tonic vibration reflex, the subject’s balance is greatly compromised. However, allowing the subject fingertip contact with a stable surface restores stability to the level seen with touch contact without vibration; many subjects report that it feels as if the vibrator is not on (Lackner et al. 2000). The efficacy of touch stabilization has been shown for Parkinsonian patients as well. When such an individual has trouble initiating locomotion, lowering a finger to touch a moving belt “unsticks” them so that they can walk the full length of the belt (Rabin et al. 2015). Our studies are a testament to Gibson’s wisdom that “The perceptual capacity of the hand goes unrecognized because we usually attend to its motor capacity…”. We are only beginning to glimpse the full range of his insights.
