Posted: 2022-03-26 19:25:14 RSS Feed 2022-03-26 19:00 digest (26 items)

Category | Site | Article title / trending word | Link URL | Frequent words / summary / search volume | Date registered
TECH Engadget Japanese BenQ and Anker projectors up to 25% off! Both home and portable models https://japanese.engadget.com/new-life-sale-projector-093433451.html anker 2022-03-26 09:34:33
TECH Engadget Japanese Get protein at a bargain in the Amazon New Life Sale! How about adding amino acid and other supplements too? https://japanese.engadget.com/new-life-sale-protein-090434876.html amazon 2022-03-26 09:04:34
AWS New posts tagged lambda - Qiita [AWS SAM] Deploying to multiple environments from a single template.yml https://qiita.com/sawa-akabee/items/d4ea2cfeced78fec2beb 2022-03-26 18:25:51
python New posts tagged Python - Qiita Deploying a Django app to Heroku via GitHub, with Postgres as the DB (2022.3.26) https://qiita.com/tan0ry0shiny/items/28024ee6f01d24d61e40 The settings are rewritten so that Postgres can be used as the DB, this time keeping the environment configuration in an env file. (A hedged settings sketch appears after the digest.) 2022-03-26 18:50:00
python New posts tagged Python - Qiita A Python script that creates subtitle files that almost exactly match VOICEPEAK audio https://qiita.com/mochi_gu_ma/items/a5a9d59865062c7479d3 From VOICEPEAK, export the audio and the lines as sequentially numbered files into a folder (⋮ menu > Export, with "save lines to file" and "split and save per block" turned on, and the naming rule set), then point the article's script at that folder to generate the subtitle .srt file; the function that builds the .srt and the full code are explained in the second half of the article. (A hedged SRT-generation sketch appears after the digest.) 2022-03-26 18:36:30
Ruby New posts tagged Ruby - Qiita How to make a checkbox https://qiita.com/murara_ra/items/70ec565a9b6561578edc While researching how to make a checkbox, I found there are several approaches. 2022-03-26 18:10:39
AWS New posts tagged AWS - Qiita [AWS SAM] Deploying to multiple environments from a single template.yml https://qiita.com/sawa-akabee/items/d4ea2cfeced78fec2beb 2022-03-26 18:25:51
Linux New posts tagged CentOS - Qiita Installing CentOS 7 on VirtualBox and setting up Apache, PHP and MySQL (LAMP) https://qiita.com/masakichi_eng/items/6a3ff25ef92934bcb016 Back in the VirtualBox window, click "Start" → "Normal Start", then click the folder icon on the right and attach the file downloaded earlier. 2022-03-26 18:50:40
Ruby New posts tagged Rails - Qiita How to make a checkbox https://qiita.com/murara_ra/items/70ec565a9b6561578edc While researching how to make a checkbox, I found there are several approaches. 2022-03-26 18:10:39
Overseas TECH DEV Community Deep Learning Library From Scratch 5: Automatic Differentiation Continued https://dev.to/ashwinscode/deep-learning-library-from-scratch-5-automatic-differentiation-continued-10f6 Part 5 of a series on building a deep learning library from scratch in Python; the full code is in the GitHub repo linked at the start of the post (autodiff.py). The post covers the implementation of automatic differentiation: a computation graph is built on the fly by a Tensor class whose overloaded operators (mul, add, sub, pow, truediv, matmul and their right-hand variants) record the operand tensors as dependencies of the result, together with the local derivative of the result with respect to each operand, wrapping non-Tensor operands as non-trainable constants. Gradients are then obtained by a depth-first traversal (get_gradients) that starts from the output with gradient 1 and applies the chain rule at every node; for the example graph c = a + b, e = c * d, this gives de/de = 1, de/dc = d, and de/da = (de/dc) * (dc/da) = d. Matrix-multiplication results are flagged so the chain rule multiplies by the transposed operand on the correct side, and when numpy broadcasting has changed shapes the incoming gradient is summed over the broadcast axes so it matches the dependency's shape before being stored in that dependency's gradient property and passed down the graph. (A minimal autodiff sketch appears after the digest.) 2022-03-26 09:18:07
Overseas TECH DEV Community How im learning Go and GraphQL (super simple clickbait title) https://dev.to/matteol/how-im-learning-go-and-graphql-super-simple-clickbait-title--440o In my opinion the best way to study new technologies is to build projects, so while studying Go and GraphQL I decided to create a basic One Piece (the famous manga) API that returns the Straw Hat crew's information and lets you create new crews. It is a simple API, but I'm proud of it anyway; if you want, you can help me improve it and add functionality. Contributions are what make the open source community such an amazing place to learn, inspire and create, and any contributions you make are greatly appreciated. 2022-03-26 09:02:30
Overseas News Japan Times latest articles Kishida voices 'serious concern' over Russian nuclear threat in visit to Hiroshima with U.S. envoy https://www.japantimes.co.jp/news/2022/03/26/national/politics-diplomacy/rahm-emanuel-fumio-kishida-hiroshima/ The prime minister and U.S. Ambassador to Japan Rahm Emanuel also reaffirmed that they would continue to work toward a world without nuclear weapons. 2022-03-26 18:22:11
Overseas News Japan Times latest articles Retailers in Japan working to reduce single-use plastics ahead of new law https://www.japantimes.co.jp/news/2022/03/26/business/single-use-plastic-cut/ Under the law, convenience stores will be required to reduce the usage of single-use plastic items that are distributed free of charge. 2022-03-26 18:41:21
Overseas News Japan Times latest articles Central League pitchers get into swing of things at plate https://www.japantimes.co.jp/sports/2022/03/26/baseball/japanese-baseball/cl-pitchers-hit/ central 2022-03-26 18:34:04
News BBC News - Home Taylor Hawkins: Foo Fighters' drummer dies aged 50 https://www.bbc.co.uk/news/entertainment-arts-60884259?at_medium=RSS&at_campaign=KARANGA hawkins 2022-03-26 09:24:48
News BBC News - Home Man and woman die in Nottingham house fire https://www.bbc.co.uk/news/uk-england-nottinghamshire-60885915?at_medium=RSS&at_campaign=KARANGA fire, police 2022-03-26 09:33:25
News BBC News - Home F1 drivers met for four hours before agreeing to race in Saudi Arabia despite missile attack https://www.bbc.co.uk/sport/formula1/60885031?at_medium=RSS&at_campaign=KARANGA The Saudi Arabian Grand Prix will go ahead after Formula 1 drivers agreed to race despite security concerns following a nearby missile attack. 2022-03-26 09:51:33
News BBC News - Home Michael Bisping: Former UFC champion 'lost identity' after losing sight in one eye https://www.bbc.co.uk/sport/av/mixed-martial-arts/60879665?at_medium=RSS&at_campaign=KARANGA Former UFC middleweight champion Michael Bisping tells BBC Sport's Paul Battison he lost his identity after losing his vision in one eye. 2022-03-26 09:02:09
Hokkaido Hokkaido Shimbun Russia to punish 'false information' about more than just its military; overseas government agencies also covered https://www.hokkaido-np.co.jp/article/661485/ government agencies 2022-03-26 18:32:22
Hokkaido Hokkaido Shimbun Prime minister visits Hiroshima with U.S. ambassador to Japan: 'Nuclear weapons must never be used' https://www.hokkaido-np.co.jp/article/661548/ ambassador to Japan 2022-03-26 18:09:24
Hokkaido Hokkaido Shimbun Manager Moriyasu "grateful for the support" at press conference on World Cup qualification https://www.hokkaido-np.co.jp/article/661569/ press conference 2022-03-26 18:09:24
Hokkaido Hokkaido Shimbun Levain Cup: Cerezo Osaka wins big in group stage matchday 3; Kashima gets its first win https://www.hokkaido-np.co.jp/article/661573/ Kashima 2022-03-26 18:13:00
Hokkaido Hokkaido Shimbun Hokuto-Moheji to Kikonai section of the Hakodate-Esashi Expressway opens https://www.hokkaido-np.co.jp/article/661572/ Hokuto-Moheji Interchange 2022-03-26 18:13:00
Hokkaido Hokkaido Shimbun Giants 7-5 Chunichi (26th): Giants rally with five runs in the eighth https://www.hokkaido-np.co.jp/article/661571/ come-from-behind win 2022-03-26 18:12:00
Hokkaido Hokkaido Shimbun SoftBank 6-3 Nippon-Ham (26th): Nippon-Ham gives up five runs in the fifth https://www.hokkaido-np.co.jp/article/661559/ Nippon-Ham 2022-03-26 18:08:12
Hokkaido Hokkaido Shimbun Body of 5-year-old boy abandoned in Saitama: mother and others re-arrested on suspicion of injury resulting in death after allegedly throwing him to the floor https://www.hokkaido-np.co.jp/article/661570/ injury resulting in death 2022-03-26 18:04:00
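
For the Heroku/Django entry above, here is a hedged sketch of the kind of settings change it describes: reading the Postgres connection and other values from an env file. The django-environ package and the variable names are assumptions for illustration only; the linked article may organize this differently.

```python
# settings.py (excerpt): hypothetical sketch using django-environ to read
# the Postgres connection and other settings from a .env file / Heroku config vars.
from pathlib import Path

import environ

BASE_DIR = Path(__file__).resolve().parent.parent

env = environ.Env(DEBUG=(bool, False))
# Load BASE_DIR/.env for local development; on Heroku the config vars
# (including DATABASE_URL from the Postgres add-on) are already set.
environ.Env.read_env(str(BASE_DIR / ".env"))

SECRET_KEY = env("SECRET_KEY")
DEBUG = env("DEBUG")
ALLOWED_HOSTS = env.list("ALLOWED_HOSTS", default=["localhost"])

# DATABASE_URL looks like postgres://user:password@host:5432/dbname
DATABASES = {"default": env.db()}
```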
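
For the VOICEPEAK subtitle entry, this is a minimal, hedged sketch of how .srt entries can be generated from a folder of sequentially numbered audio/text pairs. It is not the article's implementation; the folder layout, the helper names, and the use of the standard wave module to measure clip durations are assumptions for illustration.

```python
# Hypothetical sketch: build an .srt file from numbered WAV/TXT pairs
# exported by a tool such as VOICEPEAK. Not the original article's code.
import wave
from pathlib import Path

def wav_duration_seconds(path: Path) -> float:
    """Length of an uncompressed WAV clip in seconds."""
    with wave.open(str(path), "rb") as w:
        return w.getnframes() / w.getframerate()

def srt_timestamp(seconds: float) -> str:
    """Format seconds as the SRT timestamp HH:MM:SS,mmm."""
    ms = int(round(seconds * 1000))
    h, rem = divmod(ms, 3_600_000)
    m, rem = divmod(rem, 60_000)
    s, ms = divmod(rem, 1000)
    return f"{h:02}:{m:02}:{s:02},{ms:03}"

def make_srt(folder: str, out_file: str, gap: float = 0.0) -> None:
    """Write one subtitle block per numbered (wav, txt) pair."""
    wavs = sorted(Path(folder).glob("*.wav"))
    cursor = 0.0
    blocks = []
    for index, wav_path in enumerate(wavs, start=1):
        text = wav_path.with_suffix(".txt").read_text(encoding="utf-8").strip()
        start, end = cursor, cursor + wav_duration_seconds(wav_path)
        blocks.append(f"{index}\n{srt_timestamp(start)} --> {srt_timestamp(end)}\n{text}\n")
        cursor = end + gap  # assume clips play back to back, plus an optional gap
    Path(out_file).write_text("\n".join(blocks), encoding="utf-8")

# Example call with hypothetical paths:
# make_srt("sample_folder", "sample_jimaku.srt")
```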
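
And for the autodiff entry, below is a minimal sketch of the reverse-mode approach that post describes: a Tensor class that records dependencies and local gradients as operations are performed, plus a depth-first get_gradients pass applying the chain rule. This is a simplified element-wise illustration of the idea (it omits the matrix-multiplication and broadcasting handling the post covers) and is not the library's actual autodiff.py.

```python
import numpy as np

class Tensor:
    """Minimal reverse-mode autodiff node: a value plus links to its operands."""
    def __init__(self, value, trainable=True):
        self.value = np.asarray(value, dtype=float)
        self.trainable = trainable
        self.dependencies = []   # operand tensors this tensor was computed from
        self.grads = []          # local derivative w.r.t. each dependency
        self.gradient = np.ones_like(self.value)

    @staticmethod
    def _wrap(other):
        # Constants become non-trainable tensors, as in the post.
        return other if isinstance(other, Tensor) else Tensor(other, trainable=False)

    def _result(self, other, value, grad_self, grad_other):
        out = Tensor(value)
        out.dependencies = [self, other]
        out.grads = [grad_self, grad_other]
        return out

    def __add__(self, other):
        other = Tensor._wrap(other)
        return self._result(other, self.value + other.value,
                            np.ones_like(self.value), np.ones_like(other.value))

    def __mul__(self, other):
        other = Tensor._wrap(other)
        return self._result(other, self.value * other.value,
                            other.value, self.value)

    __radd__ = __add__
    __rmul__ = __mul__

    def get_gradients(self, grad=None):
        # Depth-first traversal: multiply the incoming gradient by each
        # local gradient (chain rule) and push it down to the operand.
        grad = np.ones_like(self.value) if grad is None else grad
        for dep, local in zip(self.dependencies, self.grads):
            if dep.trainable:
                dep.gradient = grad * local
                dep.get_gradients(dep.gradient)

# Usage, mirroring the post's example graph c = a + b, e = c * d:
a, b, d = Tensor(2.0), Tensor(3.0), Tensor(4.0)
c = a + b
e = c * d
e.get_gradients()
print(a.gradient, d.gradient)   # de/da = d = 4.0, de/dd = c = 5.0
```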
