How to resolve a numpy ValueError "shapes not aligned" when both matrices have the same size
I wrote this function:
import numpy as np

def update_parameters(t0, t1, lrate, x_matrix, x_vector, y_vector):
    # Create a 2x1 dimensional vector with t0 and t1
    hypothesis_vector = np.r_['c', [t0, t1]]
    prediction_vector = np.dot(x_matrix, hypothesis_vector)
    error_vector = np.subtract(prediction_vector, y_vector)

    # Debugging, remove later
    print(f'Dimensions: {np.ndim(error_vector)}, {np.ndim(x_vector)}')
    print(f'Shape: {error_vector.shape}, {x_vector.shape}')
    print(f'Length: {len(error_vector)}, {len(x_vector)}')

    derivative_vector = error_vector * x_vector

    # Calculate partial derivatives of the cost function.
    t0_derivative = np.sum(error_vector)
    t0 = t0 - (t0_derivative * lrate)
    t1_derivative = np.sum(derivative_vector)
    t1 = t1 - (t1_derivative * lrate)
    return t0, t1
When I run it, I get this traceback:
ValueError                                Traceback (most recent call last)
<ipython-input-114-10848703fdbf> in <module>
----> 1 test = update_parameters(0, 1, 0.5, trainset_x_matrix, trainset_x_vector, trainset_y_vector)
      2 print(test)

<ipython-input-113-6c8da5390608> in update_parameters(t0, t1, lrate, x_matrix, x_vector, y_vector)
     26     print(f'Length: {len(error_vector)},{len(x_vector)}')
     27
---> 28     derivative_vector = error_vector * x_vector
     29
     30     # Calculate partial derivative of cost function.

/opt/conda/lib/python3.7/site-packages/numpy/matrixlib/defmatrix.py in __mul__(self, other)
    218         if isinstance(other, (N.ndarray, list, tuple)):
    219             # This promotes 1-D vectors to row vectors
--> 220             return N.dot(self, asmatrix(other))
    221         if isscalar(other) or not hasattr(other, '__rmul__'):
    222             return N.dot(self, other)

<__array_function__ internals> in dot(*args, **kwargs)

ValueError: shapes (101,1) and (101,1) not aligned: 1 (dim 1) != 101 (dim 0)
I added some print statements to get the dimensions of x_vector and error_vector, and got the following output:
Dimensions: 2,2
Shape: (101,1),(101,1)
Length: 101,101
They appear identical, so why does numpy say they are not aligned? Everything I found when searching for this error was about people operating on matrices with different dimensions. I'm confused; any help would be appreciated.
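One likely explanation, inferred from the traceback rather than stated in the question: the error is raised inside numpy/matrixlib/defmatrix.py, which suggests at least one operand of error_vector * x_vector is an np.matrix rather than a plain ndarray. For np.matrix, * means matrix multiplication, and a (101,1) times (101,1) product is undefined; for ndarrays, * is element-wise. A minimal sketch reproducing the error and showing two possible fixes (the (5,1) shapes here stand in for the (101,1) vectors above):

```python
import numpy as np

# Two (5,1) column vectors built as np.matrix -- assuming, per the
# traceback, that the question's vectors are np.matrix objects.
a = np.matrix(np.ones((5, 1)))
b = np.matrix(np.ones((5, 1)))

raised = False
try:
    a * b  # for np.matrix, * is MATRIX multiplication: (5,1) x (5,1) fails
except ValueError as e:
    raised = True
    print(e)  # shapes (5,1) and (5,1) not aligned: 1 (dim 1) != 5 (dim 0)

# Fix 1: request the element-wise product explicitly.
elementwise = np.multiply(a, b)
print(elementwise.shape)  # (5, 1)

# Fix 2: convert to plain ndarrays, for which * is already element-wise.
c = np.asarray(a) * np.asarray(b)
print(c.shape)  # (5, 1)
```

Either np.multiply or an upfront conversion with np.asarray would make the derivative_vector line behave as intended; np.matrix is also deprecated in current NumPy, so plain ndarrays are generally preferred.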