Fig. 2: A Comprehensive Pipeline for Neural Data Analysis.

A Tensor construction: Collected high-dimensional neural signals are transformed into a tensor data structure. B Tensor decomposition, illustrated with Tucker decomposition. C Tensor inner product: The tensor inner product is approximated using the core tensor and factor matrices obtained from tensor decomposition, and the approximation is then incorporated into a kernel function ϕ. Here, Xi represents the ith tensor, and \({A}_{1{r}_{1}}^{i}\) denotes the r1th row of the factor matrix \({A}_{1}^{i}\). D Tensor mapping: An inseparable tensor space is mapped to a linearly separable tensor space. E Least Squares Support Tensor Machine: A model that uses least squares regression for tensor classification. Here, yi represents the label (yi = 1 or −1) of the ith tensor Xi, and ei denotes the tensor error. Further details can be found in the Methods.
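The pipeline in panels B and C can be sketched numerically. The snippet below is a minimal NumPy illustration (not the authors' implementation): it computes a Tucker decomposition via truncated higher-order SVD (HOSVD) and then evaluates the tensor inner product from the cores and factor matrices alone, as in panel C, by contracting the factor Gram matrices \({A}_{n}^{iT}{A}_{n}^{j}\) into one core. Function names, shapes, and ranks are illustrative assumptions.

```python
import numpy as np

def unfold(T, mode):
    """Matricize tensor T along the given mode."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_product(T, M, mode):
    """n-mode product: multiply tensor T by matrix M along one mode."""
    out = np.tensordot(M, np.moveaxis(T, mode, 0), axes=1)
    return np.moveaxis(out, 0, mode)

def tucker_hosvd(T, ranks):
    """Tucker decomposition via truncated HOSVD (panel B).

    Returns a core tensor G and one factor matrix per mode, so that
    T ≈ G ×1 A1 ×2 A2 ... (exact when ranks equal the mode sizes)."""
    factors = []
    for mode, r in enumerate(ranks):
        U, _, _ = np.linalg.svd(unfold(T, mode), full_matrices=False)
        factors.append(U[:, :r])
    core = T
    for mode, A in enumerate(factors):
        core = mode_product(core, A.T, mode)
    return core, factors

def tucker_inner(core1, factors1, core2, factors2):
    """Approximate <X1, X2> from the decompositions only (panel C):
    <G1, G2 ×n (A_n^T B_n)> avoids reconstructing the full tensors."""
    G = core2
    for mode, (A, B) in enumerate(zip(factors1, factors2)):
        G = mode_product(G, A.T @ B, mode)
    return float(np.sum(core1 * G))

# Demo: two small random "trial" tensors; with full ranks the
# decomposition is exact, so the approximated inner product matches
# the direct elementwise one.
rng = np.random.default_rng(0)
X1 = rng.standard_normal((3, 4, 5))
X2 = rng.standard_normal((3, 4, 5))
core1, factors1 = tucker_hosvd(X1, (3, 4, 5))
core2, factors2 = tucker_hosvd(X2, (3, 4, 5))
approx = tucker_inner(core1, factors1, core2, factors2)
exact = float(np.sum(X1 * X2))
```

With truncated ranks the same call yields a low-rank approximation of the inner product, which is what makes feeding it into a kernel function ϕ computationally attractive for high-dimensional neural data.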