Implementing Conv2D with NumPy
The `Conv2D` layer below implements both the forward and the backward pass of a 2D convolution in plain NumPy. The core idea is the im2col trick: every local image patch is unrolled into a column of a matrix, so the convolution itself reduces to a single matrix multiplication between the flattened filters and that patch matrix.

```python
import math
import copy
import numpy as np

class Conv2D(Layer):
    """A 2D convolutional layer.

    The base class `Layer` and the im2col helpers `image_to_column`,
    `column_to_image` and `determine_padding` are assumed to be defined
    elsewhere in the code base.
    """
    def __init__(self, n_filters, filter_shape, input_shape=None, padding='same', stride=1):
        self.n_filters = n_filters
        self.filter_shape = filter_shape
        self.padding = padding
        self.stride = stride
        self.input_shape = input_shape
        self.trainable = True

    def initialize(self, optimizer):
        # Initialize the weights
        filter_height, filter_width = self.filter_shape
        channels = self.input_shape[0]
        limit = 1 / math.sqrt(np.prod(self.filter_shape))
        self.W = np.random.uniform(-limit, limit, size=(self.n_filters, channels, filter_height, filter_width))
        self.w0 = np.zeros((self.n_filters, 1))
        # Weight optimizers
        self.W_opt = copy.copy(optimizer)
        self.w0_opt = copy.copy(optimizer)

    def parameters(self):
        return np.prod(self.W.shape) + np.prod(self.w0.shape)

    def output_shape(self):
        # (n_filters, output_height, output_width) for the configured padding and stride
        channels, height, width = self.input_shape
        pad_h, pad_w = determine_padding(self.filter_shape, output_shape=self.padding)
        output_height = (height + np.sum(pad_h) - self.filter_shape[0]) / self.stride + 1
        output_width = (width + np.sum(pad_w) - self.filter_shape[1]) / self.stride + 1
        return self.n_filters, int(output_height), int(output_width)

    def forward_pass(self, X, training=True):
        batch_size, channels, height, width = X.shape
        self.layer_input = X
        # Turn image shape into column shape
        # (enables dot product between input and weights)
        self.X_col = image_to_column(X, self.filter_shape, stride=self.stride, output_shape=self.padding)
        # Turn weights into column shape
        self.W_col = self.W.reshape((self.n_filters, -1))
        # Calculate output
        output = self.W_col.dot(self.X_col) + self.w0
        # Reshape into (n_filters, out_height, out_width, batch_size)
        output = output.reshape(self.output_shape() + (batch_size, ))
        # Reorder axes so that batch size comes first
        return output.transpose(3, 0, 1, 2)

    def backward_pass(self, accum_grad):
        # Reshape accumulated gradient into column shape
        accum_grad = accum_grad.transpose(1, 2, 3, 0).reshape(self.n_filters, -1)

        if self.trainable:
            # Take the dot product between the column-shaped accumulated gradient and the
            # column-shaped layer input to get the gradient w.r.t. the layer weights
            grad_w = accum_grad.dot(self.X_col.T).reshape(self.W.shape)
            # The gradient w.r.t. the bias terms is a sum, just as in the Dense layer
            grad_w0 = np.sum(accum_grad, axis=1, keepdims=True)

            # Update the layer's weights
            self.W = self.W_opt.update(self.W, grad_w)
            self.w0 = self.w0_opt.update(self.w0, grad_w0)

        # Recalculate the gradient that will be propagated back to the previous layer
        accum_grad = self.W_col.T.dot(accum_grad)
        # Reshape from column shape back to image shape
        accum_grad = column_to_image(accum_grad,
                                     self.layer_input.shape,
                                     self.filter_shape,
                                     stride=self.stride,
                                     output_shape=self.padding)

        return accum_grad
```
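To see how the layer is meant to be driven, here is a minimal usage sketch. It assumes the im2col helpers the class relies on (`image_to_column`, `column_to_image`, `determine_padding`) and the `Layer` base class are available from the surrounding code base, and it substitutes a hypothetical `StubSGD` optimizer: `initialize()` only copies the optimizer object and `backward_pass()` only calls its `update()` method, so any object with that interface will do.

```python
import numpy as np

# Hypothetical stand-in for the library's optimizer: anything with an
# update(w, grad_wrt_w) method that returns the new weights will do.
class StubSGD:
    def __init__(self, learning_rate=0.01):
        self.learning_rate = learning_rate

    def update(self, w, grad_wrt_w):
        # Plain gradient descent step
        return w - self.learning_rate * grad_wrt_w

# Toy batch: 4 RGB images of size 8x8, channels-first layout (N, C, H, W).
X = np.random.randn(4, 3, 8, 8)

conv = Conv2D(n_filters=16, filter_shape=(3, 3), input_shape=(3, 8, 8),
              padding='same', stride=1)
conv.initialize(optimizer=StubSGD(learning_rate=0.01))

out = conv.forward_pass(X, training=True)
print(out.shape)      # expected: (4, 16, 8, 8) with 'same' padding and stride 1

# A fake upstream gradient of the same shape as the output drives one
# backward/update step and returns the gradient w.r.t. the input.
grad_in = conv.backward_pass(np.ones_like(out))
print(grad_in.shape)  # expected: (4, 3, 8, 8)
```

The im2col route trades memory for speed: every input patch is materialized explicitly in `X_col`, but in exchange both the forward pass and the weight gradient become single dense matrix multiplications.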
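Everything above hinges on the exact layout that `image_to_column` produces. As a point of reference, a naive loop-based version of that layout might look as follows (the name `image_to_column_naive` and the loop-based approach are for illustration only; the real helper is typically vectorized with index arithmetic, and the `'same'` padding here assumes odd filter sizes):

```python
import numpy as np

def image_to_column_naive(images, filter_shape, stride=1, output_shape='same'):
    """Loop-based reference for the im2col layout the layer above expects.

    Rows enumerate (channel, filter_row, filter_col), matching
    W.reshape((n_filters, -1)); columns enumerate the output positions in
    row-major order with the batch index varying fastest, which is what the
    reshape in forward_pass and backward_pass relies on.
    """
    batch_size, channels, height, width = images.shape
    fh, fw = filter_shape
    if output_shape == 'same':
        pad_h, pad_w = fh // 2, fw // 2   # symmetric padding for odd filter sizes
    else:  # 'valid'
        pad_h, pad_w = 0, 0
    padded = np.pad(images, ((0, 0), (0, 0), (pad_h, pad_h), (pad_w, pad_w)),
                    mode='constant')
    out_h = (height + 2 * pad_h - fh) // stride + 1
    out_w = (width + 2 * pad_w - fw) // stride + 1

    cols = np.empty((channels * fh * fw, out_h * out_w * batch_size))
    col = 0
    for i in range(out_h):
        for j in range(out_w):
            for b in range(batch_size):
                # Extract the (channels, fh, fw) patch under the filter window
                patch = padded[b, :, i * stride:i * stride + fh, j * stride:j * stride + fw]
                cols[:, col] = patch.reshape(-1)
                col += 1
    return cols
```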