{"id":1052,"date":"2017-09-13T11:26:51","date_gmt":"2017-09-13T11:26:51","guid":{"rendered":"http:\/\/blog.tiran.info\/?p=1052"},"modified":"2017-09-13T11:26:51","modified_gmt":"2017-09-13T11:26:51","slug":"annore-3","status":"publish","type":"post","link":"https:\/\/blog.tiran.stream\/?p=1052","title":{"rendered":"ANN\/ORE #3 &#8211; Building the neural network with neuralnet"},"content":{"rendered":"<p style=\"text-align: justify;\">In the <a href=\"http:\/\/blog.tiran.info\/annore-2\" target=\"_blank\" rel=\"noopener\">previous posts<\/a>, I used the nnet package, which only supports simple neural networks (a single hidden layer). To build a more complex ANN, other R packages are available; here I will use &#8220;<a href=\"https:\/\/cran.r-project.org\/web\/packages\/neuralnet\/\" target=\"_blank\" rel=\"noopener\">neuralnet<\/a>&#8221;, again with the MNIST data.<\/p>\n<p style=\"text-align: justify;\">Unlike nnet, the neuralnet library must be installed. I therefore set it up in the ORE distribution on the database server:<\/p>\n<pre class=\"brush: js; ruler: true;\"> \noracle@psu888: \/home\/oracle [HODBA04D1_1]# ORE CMD INSTALL \/tmp\/neuralnet_1.33.tar.gz\n* installing to library \u2018\/soft\/oracle\/product\/rdbms\/12.2.0.1\/R\/library\u2019\n* installing *source* package \u2018neuralnet\u2019 ...\n** package \u2018neuralnet\u2019 successfully unpacked and MD5 sums checked\n** R\n** preparing package for lazy loading\n** help\n*** installing help indices\n  converting help for package \u2018neuralnet\u2019\n    finding HTML links ... done\n    compute                                 html\n    confidence.interval                     html\n    gwplot                                  html\n    neuralnet-package                       html\n    neuralnet                               html\n    plot.nn                                 html\n    prediction                              html\n** building package indices\n** testing if installed package can be loaded\n* DONE (neuralnet)\noracle@psu888: \/home\/oracle [HODBA04D1_1]#\n<\/pre>\n<p style=\"text-align: justify;\">The package can then be used in ore calls &#8211; with two caveats, however:<\/p>\n<ul>\n<li>expressions of the form &#8220;y ~ .&#8221; are not supported, so the predictors and dependent variables must be listed explicitly in a formula.<\/li>\n<li>factors are not supported, so a one-hot encoding step is required.<\/li>\n<\/ul>\n<p>I will now build an ANN with two hidden layers &#8211; the first with 300 neurons, the second with 100:<\/p>\n<pre class=\"brush: js; ruler: true;\"> \n&gt; library(tictoc)\n&gt; tic()\n&gt; \n&gt; ore.doEval(function() {\n+   library(ORE)\n+   library(neuralnet)\n+   set.seed(3456)\n+   ore.sync(table = &quot;MNIST_TRAINING_SET&quot;)\n+   mnist_training &lt;- ore.pull(ore.get(&quot;MNIST_TRAINING_SET&quot;))\n+   \n+   # -- One-hot encoding of the IMG_LBL field\n+   mnist_training_ohe &lt;- as.data.frame(model.matrix(~.-1,mnist_training))\n+   \n+   # -- Build the formula, listing the fields explicitly\n+   f &lt;- as.formula(paste(paste(paste(&quot;IMG_LBL&quot;,seq(0,9),sep=&quot;&quot;),collapse=&quot;+&quot;), &quot; ~&quot;, paste(paste(&quot;P&quot;,seq(1,784),sep=&quot;&quot;),collapse=&quot;+&quot;)))\n+   \n+   # -- Build the model\n+   nn_neuralnet &lt;- neuralnet(f, data=mnist_training_ohe,\n+                             hidden=c(300, 100),\n+                             linear.output=FALSE)\n+   \n+   # -- Save the model\n+   ore.save(list=c(&quot;nn_neuralnet&quot;),name=&quot;DS NeuralNet&quot;, append = TRUE)\n+   \n+   # -- Score the model\n+   ore.sync(table = &quot;MNIST_TEST_SET&quot;)\n+   mnist_test_orig &lt;- ore.pull(ore.get(&quot;MNIST_TEST_SET&quot;))\n+   mnist_test &lt;- mnist_test_orig[,c(-1,-2)]\n+   mnist_pred &lt;- compute(nn_neuralnet,mnist_test)$net.result\n+   \n+   # -- Reverse the one-hot encoding (net.result is a matrix, so colnames)\n+   colnames(mnist_pred) &lt;- c(&quot;0&quot;,&quot;1&quot;,&quot;2&quot;,&quot;3&quot;,&quot;4&quot;,&quot;5&quot;,&quot;6&quot;,&quot;7&quot;,&quot;8&quot;,&quot;9&quot;)\n+   pred &lt;- colnames(mnist_pred)[max.col(mnist_pred)]\n+   \n+   # -- Contingency table\n+   print(table(pred,mnist_test_orig[1]$IMG_LBL))\n+ }, ore.connect = TRUE)\n    \npred    0    1    2    3    4    5    6    7    8    9\n   0  966    0    8    2    1    3    8    3    9    9\n   1    0 1121    2    3    2    1    3    3    1    4\n   2    3    2  977    8    6    3    3   12    9    0\n   3    1    2   11  952    2   13    1    6   17   11\n   4    1    1    4    1  934    4    5    4    5   15\n   5    3    0    1    7    1  847   12    1   12    4\n   6    2    3    5    1    7    7  924    0    5    0\n   7    2    2   10   11    5    2    0  987    4   12\n   8    1    4   13   18    2    8    2    2  909    5\n   9    1    0    1    7   22    4    0   10    3  949\n&gt; \n&gt; toc()\n2385.31 sec elapsed\n&gt; ore.datastoreSummary(&quot;DS NeuralNet&quot;)\n   object.name        class        size length row.count col.count\n1      nn_nnet nnet.formula   407018410     19        NA        NA   \n2 nn_neuralnet           nn 23492644634     13        NA        NA\n&gt;\n<\/pre>\n<p style=\"text-align: justify;\">As with nnet, the contingency table shows that we obtain excellent classification results.<\/p>\n<p style=\"text-align: justify;\">Two other aspects, however, are worth noting:<\/p>\n<ul style=\"text-align: justify;\">\n<li>the model-building time is drastically lower than in the nnet test: 2385 seconds versus 49865 &#8211; a 95% reduction! We will come back to this in a future post.<\/li>\n<li>the saved model is gigantic compared with the one produced by nnet: 23.5GB versus 400MB.<\/li>\n<\/ul>\n<p style=\"text-align: justify;\">Unfortunately, this turns out to be a source of problems, since reloading the model requires an enormous amount of time and memory:<\/p>\n<pre class=\"brush: js; ruler: true;\"> \n&gt; tic()\n&gt; \n&gt; ore.doEval(function() {\n+   library(ORE)\n+   library(neuralnet)\n+   ore.load(&quot;DS NeuralNet&quot;, list = c(&quot;nn_neuralnet&quot;))\n+ }, ore.connect = TRUE)\nError in .oci.GetQuery(conn, statement, data = data, prefetch = prefetch,  : \n  ORA-20000: RQuery error\nError : vector memory exhausted (limit reached?)\nORA-06512: at &quot;RQSYS.RQEVALIMPL&quot;, line 104\nORA-06512: at &quot;RQSYS.RQEVALIMPL&quot;, line 101\n&gt; \n&gt; toc()\n7044.19 sec elapsed\n&gt; \n&gt; ore.doEval(gc)\n          used (Mb) gc trigger    (Mb) limit (Mb)   max used    (Mb)\nNcells 1830257 97.8   17095973   913.1       5600   30228485  1614.4\nVcells 2547251 19.5 2497018182 19050.8      32768 3901465136 29765.9\n&gt;\n<\/pre>\n<p style=\"text-align: justify;\">Here, the reload failed after a little under two hours because the memory allocated to the Vcells was exhausted&#8230;<\/p>\n<p style=\"text-align: justify;\">In conclusion, the neuralnet package makes it possible to build ANNs with more complex topologies than nnet. 
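To make the key calls concrete, here is a minimal, self-contained sketch of the same neuralnet API on a toy XOR dataset (an illustrative example with assumed names and layer sizes, not the MNIST run above):<\/p>\n<pre class=\"brush: js; ruler: true;\"> \nlibrary(neuralnet)\nset.seed(42)\n\n# Toy dataset -- illustrative only\nxor_df &lt;- data.frame(x1 = c(0, 0, 1, 1),\n                     x2 = c(0, 1, 0, 1),\n                     y  = c(0, 1, 1, 0))\n\n# An explicit formula is required: &quot;y ~ .&quot; is not supported\nnn &lt;- neuralnet(y ~ x1 + x2, data = xor_df,\n                hidden = c(4, 2),   # two hidden layers\n                threshold = 0.1,\n                linear.output = FALSE)\n\n# Score with compute(); $net.result holds the raw network outputs\npred &lt;- compute(nn, xor_df[, c(&quot;x1&quot;, &quot;x2&quot;)])$net.result\n<\/pre>\n<p style=\"text-align: justify;\">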
While the package performs extremely well during the model-building phase, reusing the model once it has been saved proves problematic.<\/p>\n<p style=\"text-align: justify;\">To be continued&#8230;<\/p>\n","protected":false},"excerpt":{"rendered":"<p>In the previous posts, I used the nnet package. It only supports the creation of simple neural networks<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"colormag_page_container_layout":"default_layout","colormag_page_sidebar_layout":"default_layout","footnotes":""},"categories":[2,3,9],"tags":[],"class_list":["post-1052","post","type-post","status-publish","format-standard","hentry","category-ann","category-classification","category-oracle-r-enterprise"],"_links":{"self":[{"href":"https:\/\/blog.tiran.stream\/index.php?rest_route=\/wp\/v2\/posts\/1052","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/blog.tiran.stream\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/blog.tiran.stream\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/blog.tiran.stream\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/blog.tiran.stream\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=1052"}],"version-history":[{"count":0,"href":"https:\/\/blog.tiran.stream\/index.php?rest_route=\/wp\/v2\/posts\/1052\/revisions"}],"wp:attachment":[{"href":"https:\/\/blog.tiran.stream\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=1052"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/blog.tiran.stream\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=1052"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/blog.tiran.stream\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=1052"}],"curies":[
{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}