The gradient descent step of backpropagation

How do I work out the gradient descent step of backpropagation?

I am trying to write a simple two-layer neural network, as I describe here: https://itisexplained.com/html/NN/ml/5_codingneuralnetwork/

After calculating the gradients of the outer and inner layers via backpropagation, I am stuck at the final step: updating the weights.

#---------------------------------------------------------------

# Two-layer network, based on (1) and the equations we derived as explanations
# (1) http://iamtrask.github.io/2015/07/12/basic-python-network/
#---------------------------------------------------------------

import numpy as np
# seed random numbers to make calculation deterministic 
np.random.seed(1)

# pretty print numpy array
np.set_printoptions(formatter={'float': '{: 0.3f}'.format})

# let us code our sigmoid function
def sigmoid(x):
    return 1/(1+np.exp(-x))

# derivative of the sigmoid; note that x is expected to already be sigmoid(z)
def derv_sigmoid(x):
    return x*(1-x)

# set learning rate as 1 for this toy example
learningRate =  1

# input x, also used as the training set here (4 examples, 3 features each)
x = np.array([ [0,0,1],[0,1,1],[1,0,1],[1,1,1] ])

# desired output for each row of the training set above
y = np.array([[0,1,1,0]]).T

# Explanation - output is 1 when the input contains two ones, but not three
"""
Input [0,0,1]  Output = 0
Input [0,1,1]  Output = 1
Input [1,0,1]  Output = 1
Input [1,1,1]  Output = 0
"""

hidden_units = 4
# Randomly initialised weights: 3 inputs -> 4 hidden units -> 1 output
weight1 =  np.random.random((3,hidden_units))
weight2 =  np.random.random((hidden_units,1))

print("Shape weight1",np.shape(weight1)) #debug
print("Shape weight2",np.shape(weight2)) #debug

# Activation to layer 0 is taken as input x
a0 = x

iterations = 1000
for iter in range(0,iterations):

  # Forward pass - Straight Forward
  z1= x @ weight1
  a1 = sigmoid(z1) 
  z2= a1 @ weight2
  a2 = sigmoid(z2) 

  # Backward Pass - Backpropagation 
  delta2  = (y-a2)
  #---------------------------------------------------------------
  # Calculating change of Cost/Loss w.r.t. weight of 2nd/last layer
  # Eq (A) ---> dC_dw2 = delta2*derv_sigmoid(z2)
  # (derv_sigmoid expects the activation, so we pass a2 = sigmoid(z2))
  #---------------------------------------------------------------

  dC_dw2  = delta2 * derv_sigmoid(a2)

  if iter == 0:
    print("Shape dC_dw2",np.shape(dC_dw2)) #debug
  
  #---------------------------------------------------------------
  # Calculating change of Cost/Loss w.r.t. weight of 1st/inner layer
  # Eq (B) ---> dC_dw1 = derv_sigmoid(a1)*delta2*derv_sigmoid(a2)*weight2
  # note  delta2*derv_sigmoid(a2) == dC_dw2
  # dC_dw1 = derv_sigmoid(a1)*dC_dw2*weight2
  #---------------------------------------------------------------

  dC_dw1 =  (np.multiply(dC_dw2,weight2.T)) * derv_sigmoid(a1)
  if iter == 0:
    print("Shape dC_dw1",np.shape(dC_dw1)) #debug
  

  #---------------------------------------------------------------
  # Gradient descent
  #---------------------------------------------------------------
 
  #weight2 = weight2 - learningRate*dC_dw2 --> this is what the textbook seems to say
  #weight1 = weight1 - learningRate*dC_dw1

  weight2 = weight2 + learningRate*np.dot(a1.T,dC_dw2) # this is what works
  weight1 = weight1 + learningRate*np.dot(a0.T,dC_dw1) 
  

print("New output\n",a2)
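For what it's worth, `a1.T @ dC_dw2` also has a batch interpretation: each training example contributes an outer product of its activation with its error signal, and the matrix product sums those contributions over the batch. A quick sketch with synthetic values (the shapes match the script above; this is my own illustration, not from the linked posts):

```python
import numpy as np

np.random.seed(1)
a1     = np.random.random((4, 4))   # hidden activations: 4 samples x 4 units
dC_dw2 = np.random.random((4, 1))   # per-sample error signal: 4 samples x 1 output

# per-sample outer products, summed over the batch
summed = sum(np.outer(a1[n], dC_dw2[n]) for n in range(4))

# identical to the matrix product used in the update rule
print(np.allclose(summed, a1.T @ dC_dw2))  # True
print(summed.shape)                        # (4, 1) - same shape as weight2
```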

Why is

  weight2 = weight2 + learningRate*np.dot(a1.T,dC_dw2)
  weight1 = weight1 + learningRate*np.dot(a0.T,dC_dw1) 

used, instead of

  #weight2 = weight2 - learningRate*dC_dw2
  #weight1 = weight1 - learningRate*dC_dw1 

I don't understand where the equation that updates the weights by multiplying with the previous layer's activation comes from, nor the intuition behind it. I would like to keep the code simple and clear.

Asking it here because it involves some code; also asked here: https://ai.stackexchange.com/questions/26920/updating-the-weights-in-back-propagation
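One way to see where the extra factor comes from is the chain rule: since `z2 = a1 @ weight2`, the derivative of `z2` with respect to `weight2` is `a1`, so the full gradient of the cost with respect to `weight2` is `a1.T @ (delta2 * derv_sigmoid(a2))` — the term the textbook equation leaves implicit. A minimal, self-contained finite-difference check of that claim (my own sketch, not from the linked posts; it assumes the squared-error cost `C = 0.5*sum((y - a2)**2)`):

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

np.random.seed(1)
x  = np.array([[0,0,1],[0,1,1],[1,0,1],[1,1,1]], dtype=float)
y  = np.array([[0,1,1,0]], dtype=float).T
w1 = np.random.random((3, 4))
w2 = np.random.random((4, 1))

def loss(w2_):
    a1 = sigmoid(x @ w1)
    a2 = sigmoid(a1 @ w2_)
    return 0.5 * np.sum((y - a2) ** 2)

# analytic gradient: chain rule through z2 = a1 @ w2 brings in a1.T
a1 = sigmoid(x @ w1)
a2 = sigmoid(a1 @ w2)
analytic = a1.T @ ((a2 - y) * a2 * (1 - a2))   # same shape as w2

# numerical gradient by central differences, one weight at a time
eps = 1e-6
numeric = np.zeros_like(w2)
for i in range(w2.shape[0]):
    wp, wm = w2.copy(), w2.copy()
    wp[i, 0] += eps
    wm[i, 0] -= eps
    numeric[i, 0] = (loss(wp) - loss(wm)) / (2 * eps)

print(np.allclose(analytic, numeric, atol=1e-6))  # gradients agree
```

Note the sign: `analytic` uses `(a2 - y)`, so descending means `w2 -= learningRate * analytic`, which is exactly `w2 += learningRate * a1.T @ ((y - a2) * derv_sigmoid(a2))` — the update that works in the script.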

