Furthermore, gradient descent and recursive least squares are used to optimize the parameters of the basis functions and the linear weights of the output layer, respectively.

  • Meanwhile, to further improve network performance, a hybrid learning strategy combining gradient descent and recursive least squares is adopted: the two methods learn the basis-function parameters (centers and widths) and the output-layer linear weights, respectively.