Simplified State Space Layers for Sequence Modeling

Diagonal State Spaces are as Effective as Structured State Spaces

Resurrecting Recurrent Neural Networks for Long Sequences

Combining Recurrent, Convolutional, and Continuous-time Models with Linear State-Space Layers

Legendre Memory Units: Continuous-Time Representation in Recurrent Neural Networks

Efficiently Modeling Long Sequences with Structured State Spaces
