MATLAB autoencoder code by Hinton, who first proposed the deep learning algorithm
Tags:
• File type: .zip
• File size: 22.15MB
• Downloads: 1
MATLAB autoencoder code from Hinton, who first proposed the deep learning algorithm. Contents: multi-layer feature training by autoencoding with stacked RBMs, followed by fine-tuning with a gradient-based algorithm. It can be used for feature extraction or for classification.
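The layer-wise pretraining the description refers to trains each RBM with one-step contrastive divergence (CD-1), as in the bundled rbm.m. A minimal NumPy sketch of one CD-1 update on a binary RBM — variable names are illustrative, not taken from the MATLAB code:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(v0, W, b_vis, b_hid, lr=0.1):
    """One contrastive-divergence (CD-1) update for a binary RBM."""
    h0 = sigmoid(v0 @ W + b_hid)                  # up: hidden probabilities
    h0_sample = (h0 > rng.random(h0.shape)) * 1.0  # stochastic binary states
    v1 = sigmoid(h0_sample @ W.T + b_vis)         # down: reconstruction
    h1 = sigmoid(v1 @ W + b_hid)                  # up again from reconstruction
    n = v0.shape[0]
    W = W + lr * (v0.T @ h0 - v1.T @ h1) / n      # positive minus negative phase
    b_vis = b_vis + lr * (v0 - v1).mean(axis=0)
    b_hid = b_hid + lr * (h0 - h1).mean(axis=0)
    return W, b_vis, b_hid

# tiny smoke run on random binary 'data'
v = (rng.random((5, 6)) > 0.5) * 1.0
W = 0.1 * rng.standard_normal((6, 4))
W, bv, bh = cd1_step(v, W, np.zeros(6), np.zeros(4))
```

After each RBM is trained this way, its hidden activations become the training data for the next RBM in the stack; the fine-tuning code below then unrolls the stack into an autoencoder.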
Code snippet and file information
Attribute   Size      Date       Time  Name
----------- --------- ---------- ----- ----
Directory   0         2016-05-06 14:45 code\
File        5594      2006-05-21 11:34 code\backprop.m
File        5474      2006-06-20 09:49 code\backpropclassify.m
Directory   0         2016-05-06 14:45 code\backup ZIP\
File        51200     2016-04-15 10:41 code\backup ZIP\Autoencoder_Code.tar
File        8995      2016-04-15 10:41 code\backup ZIP\minimize.m
File        1648877   2016-04-15 10:40 code\backup ZIP\t10k-images-idx3-ubyte.gz
File        4542      2016-04-15 10:40 code\backup ZIP\t10k-labels-idx1-ubyte.gz
File        9912422   2016-04-15 10:39 code\backup ZIP\train-images-idx3-ubyte.gz
File        28881     2016-04-15 10:39 code\backup ZIP\train-labels-idx1-ubyte.gz
File        1853      2006-06-20 09:49 code\CG_CLASSIFY.m
File        1136      2006-06-20 09:49 code\CG_CLASSIFY_INIT.m
File        2727      2006-06-20 09:49 code\CG_MNIST.m
File        3011      2006-06-20 09:49 code\converter.m
File        4169      2006-06-20 09:49 code\makebatches.m
File        8995      2016-04-15 10:41 code\minimize.m
File        1902      2006-06-20 09:49 code\mnistclassify.m
File        2199      2006-06-20 09:49 code\mnistdeepauto.m
File        1084      2006-06-20 09:49 code\mnistdisp.m
File        3914      2006-06-20 09:49 code\rbm.m
File        3964      2006-06-20 09:49 code\rbmhidlinear.m
File        2934      2006-07-13 23:40 code\README.txt
File        7840016   1998-01-26 23:07 code\t10k-images.idx3-ubyte
File        10008     1998-01-26 23:07 code\t10k-labels.idx1-ubyte
File        47040016  1996-11-18 23:36 code\train-images.idx3-ubyte
File        60008     1996-11-18 23:36 code\train-labels.idx1-ubyte
% Version 1.000
%
% Code provided by Ruslan Salakhutdinov and Geoff Hinton
%
% Permission is granted for anyone to copy, use, modify, or distribute this
% program and accompanying programs and documents for any purpose, provided
% this copyright notice is retained and prominently displayed, along with
% a note saying that the original programs are available from our
% web page.
% The programs and documents are distributed without any warranty, express or
% implied. As the programs were written for research purposes only, they have
% not been tested to the degree that would be advisable in any important
% application. All use of these programs is entirely at the user's own risk.

% This program fine-tunes an autoencoder with backpropagation.
% Weights of the autoencoder are going to be saved in mnist_weights.mat
% and training and test reconstruction errors in mnist_error.mat
% You can also set maxepoch; default value is 200 as in our paper.
maxepoch=200;
fprintf(1,'\nFine-tuning deep autoencoder by minimizing cross entropy error. \n');
fprintf(1,'60 batches of 1000 cases each. \n');
load mnistvh
load mnisthp
load mnisthp2
load mnistpo
makebatches;
[numcases numdims numbatches]=size(batchdata);
N=numcases;
%%%% PREINITIALIZE WEIGHTS OF THE AUTOENCODER %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
w1=[vishid; hidrecbiases];
w2=[hidpen; penrecbiases];
w3=[hidpen2; penrecbiases2];
w4=[hidtop; toprecbiases];
w5=[hidtop'; topgenbiases];
w6=[hidpen2'; hidgenbiases2];
w7=[hidpen'; hidgenbiases];
w8=[vishid'; visbiases];
%%%%%%%%%% END OF PREINITIALIZATION OF WEIGHTS %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
l1=size(w1,1)-1;
l2=size(w2,1)-1;
l3=size(w3,1)-1;
l4=size(w4,1)-1;
l5=size(w5,1)-1;
l6=size(w6,1)-1;
l7=size(w7,1)-1;
l8=size(w8,1)-1;
l9=l1;
test_err=[];
train_err=[];
for epoch = 1:maxepoch
%%%%%%%%%%%%%%%%%%%% COMPUTE TRAINING RECONSTRUCTION ERROR %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
err=0;
[numcases numdims numbatches]=size(batchdata);
N=numcases;
for batch = 1:numbatches
data = [batchdata(:,:,batch)];
data = [data ones(N,1)];
w1probs = 1./(1 + exp(-data*w1)); w1probs = [w1probs ones(N,1)];
w2probs = 1./(1 + exp(-w1probs*w2)); w2probs = [w2probs ones(N,1)];
w3probs = 1./(1 + exp(-w2probs*w3)); w3probs = [w3probs ones(N,1)];
w4probs = w3probs*w4; w4probs = [w4probs ones(N,1)];
w5probs = 1./(1 + exp(-w4probs*w5)); w5probs = [w5probs ones(N,1)];
w6probs = 1./(1 + exp(-w5probs*w6)); w6probs = [w6probs ones(N,1)];
w7probs = 1./(1 + exp(-w6probs*w7)); w7probs = [w7probs ones(N,1)];
dataout = 1./(1 + exp(-w7probs*w8));
err = err + 1/N*sum(sum( (data(:,1:end-1)-dataout).^2 ));
end
train_err(epoch)=err/numbatches;
%%%%%%%%%%%%%% END OF COMPUTING TRAINING RECONSTRUCTION ERROR %%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%% DISPLAY FIGURE TOP ROW REAL DATA BOTTOM ROW RECONSTRUCTIONS %%%%%%%%%%%%%%%%%%%%%%%%%
fprintf(1,'Displaying in figure 1: Top row - real data, Bottom row -- reconstructions \n');
output=[];
for i
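The reconstruction-error loop above appends a bias column of ones at each layer, applies the logistic function everywhere except the linear 30-unit code layer (w4), and averages the squared reconstruction error. A NumPy sketch of the same forward pass, assuming the 784-1000-500-250-30 layer sizes that mnistdeepauto.m uses, with random stand-in weights:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def reconstruction_error(data, weights):
    """Forward pass through the unrolled autoencoder.

    Each weight matrix has shape (fan_in + 1, fan_out): the extra last row
    holds the biases, matching the [w; biases] stacking in the MATLAB code.
    Layer 4 (the 30-unit code layer) is linear; all others are logistic.
    """
    n = data.shape[0]
    probs = data
    for i, w in enumerate(weights):
        probs = np.hstack([probs, np.ones((n, 1))])  # append bias column
        pre = probs @ w
        probs = pre if i == 3 else sigmoid(pre)      # w4 layer stays linear
    return np.sum((data - probs) ** 2) / n           # mean per-case sq. error

# smoke run with random weights on fake 'images' in [0, 1]
rng = np.random.default_rng(0)
sizes = [784, 1000, 500, 250, 30, 250, 500, 1000, 784]
ws = [0.01 * rng.standard_normal((a + 1, b)) for a, b in zip(sizes[:-1], sizes[1:])]
err = reconstruction_error(rng.random((10, 784)), ws)
```

In the real script this error is only monitored; the actual fine-tuning objective minimized by the conjugate-gradient routine (minimize.m) is the cross-entropy mentioned in the fprintf above.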