Posted: 2021-12-26 21:11:01  RSS feed digest for 2021-12-26 21:00 (18 items)

Category  Site  Article title / trend word  Link URL  Frequent words / summary / search volume  Date registered
python  New posts tagged Python - Qiita  Canny edge detection with NumPy  https://qiita.com/aa_debdeb/items/035564db05fcb79a1b57  First, pixels whose value exceeds the high threshold T_high are treated as edges.  2021-12-26 20:37:03
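The summary above refers to Canny's double-threshold (hysteresis) step. As a rough NumPy sketch of that step only, not the article's actual code (the array, image size, and threshold values below are made up):

```python
import numpy as np

# Hypothetical gradient-magnitude image and thresholds (not from the article).
grad = np.random.rand(64, 64).astype(np.float32)
t_low, t_high = 0.1, 0.3

strong = grad > t_high            # definite edges: magnitude above the high threshold
weak = (grad > t_low) & ~strong   # candidates: magnitude between the two thresholds

# One-pass approximation of hysteresis: keep a weak pixel only if it touches a strong pixel.
padded = np.pad(strong, 1)
has_strong_neighbour = np.zeros_like(strong)
for dy in (-1, 0, 1):
    for dx in (-1, 0, 1):
        has_strong_neighbour |= padded[1 + dy:65 + dy, 1 + dx:65 + dx]
edges = strong | (weak & has_strong_neighbour)
```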
js  New posts tagged JavaScript - Qiita  Automating the setup for arbitrary script execution in RPG Maker MV  https://qiita.com/qwe001/items/cf5db359c912775b22e9  cheerio is installed locally from within the batch file; if something does not work, try installing it globally instead.  2021-12-26 20:05:42
Program  New questions across [all tags] - teratail  Cannot get Apache and WSGI working together  https://teratail.com/questions/375644?rss=all  I am setting up a web server on CentOS using Apache and WSGI, but I cannot get Apache to hook up to WSGI.  2021-12-26 20:44:11
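When debugging an Apache + mod_wsgi setup like the one in this question, a common first step is to confirm the WSGI application itself is valid before touching the Apache configuration. A minimal sketch of such an application (the file path and message are illustrative, not from the question):

```python
# /var/www/myapp/wsgi.py (path illustrative) -- the entry point mod_wsgi would load
def application(environ, start_response):
    # The WSGI contract: receive the request environ, call start_response with a
    # status line and headers, then return an iterable of byte strings.
    body = b"Hello from WSGI behind Apache\n"
    start_response("200 OK", [
        ("Content-Type", "text/plain"),
        ("Content-Length", str(len(body))),
    ])
    return [body]

if __name__ == "__main__":
    # Sanity check without Apache: serve the same app with the stdlib server.
    from wsgiref.simple_server import make_server
    make_server("127.0.0.1", 8000, application).serve_forever()
```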
Program  New questions across [all tags] - teratail  Channel scan does not work  https://teratail.com/questions/375643?rss=all  What I want to achieve: I am building a recording server with Docker, Mirakurun and EPGStation, but the channel scan fails with the message "no available tuners".  2021-12-26 20:34:53
Program  New questions across [all tags] - teratail  Please show me a cleaner way to write this  https://teratail.com/questions/375642?rss=all  Please show me a cleaner way to write this.  2021-12-26 20:12:16
Ruby  New posts tagged Ruby - Qiita  How to fix a failed bundle install of the mysql2 gem in Rails: solved by switching from LibreSSL to OpenSSL  https://qiita.com/hajsu00/items/5be5d9da574529a4194e  Running brew info openssl reports "openssl: stable (bottled), keg-only, Cryptography and SSL/TLS Toolkit ... Not installed" (that is, OpenSSL is not installed), followed by the formula file and its Apache license (snipped). I read this as: Homebrew knows the OpenSSL formula exists, but it is not actually installed.  2021-12-26 20:13:49
AWS  New posts tagged AWS - Qiita  Things to watch out for when configuring WordPress with AWS WAF  https://qiita.com/holdout0521/items/252705c33914dcab000c  Introduction: while operating a WordPress server I had the opportunity to introduce AWS WAF as a security measure, and I ran into an unexpected error, so I am writing it down here.  2021-12-26 20:20:44
AWS  New posts tagged AWS - Qiita  [CloudFormation] Connecting to RDS through RDS Proxy using AWS Secrets Manager  https://qiita.com/HORI_TOMO/items/22aab25efcbc6d25655c  Reference: a short explanation of why AWS Lambda and relational databases are a poor match. Requirements: the database is PostgreSQL; user authentication uses AWS Secrets Manager; the RDS endpoint name is used as the host name and mapped to a domain name in a DNS record set. Steps: configure the security groups, create the IAM role, create the RDS instance and RDS Proxy, create the Secrets Manager secret, and configure the records in Route 53. Security groups: EC2 and Lambda connect to RDS through the proxy, so the proxy's security group allows inbound traffic from EC2 and Lambda, and since the proxy's target is RDS, the RDS security group allows inbound traffic from the proxy.  2021-12-26 20:02:38
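The article builds this infrastructure with CloudFormation; as a client-side illustration only of the setup it describes (credentials in Secrets Manager, PostgreSQL reached through the RDS Proxy endpoint), a hedged Python sketch might look like the following. The secret name, region, and proxy host are placeholders, not values from the article:

```python
import json
import boto3
import psycopg2  # PostgreSQL driver, matching the article's choice of DB engine

# Placeholder identifiers -- the real names and hostnames come from the CloudFormation stack.
SECRET_ID = "my-rds-proxy-secret"
PROXY_HOST = "my-proxy.proxy-xxxxxxxx.ap-northeast-1.rds.amazonaws.com"

def connect_via_proxy():
    # Fetch the DB credentials that RDS Proxy also uses for authentication.
    sm = boto3.client("secretsmanager", region_name="ap-northeast-1")
    secret = json.loads(sm.get_secret_value(SecretId=SECRET_ID)["SecretString"])

    # Connect to the proxy endpoint instead of the RDS instance endpoint,
    # so connections from Lambda/EC2 are pooled by the proxy.
    return psycopg2.connect(
        host=PROXY_HOST,
        port=5432,
        user=secret["username"],
        password=secret["password"],
        dbname=secret.get("dbname", "postgres"),
        sslmode="require",
    )
```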
Docker  New posts tagged docker - Qiita  Why you should avoid running multiple processes in one container  https://qiita.com/kazurego7/items/57f5fb80b4783b7633a1  Running multiple services inside one container: the wrapper-script approach described in the documentation referenced above has a serious problem and should not be used in production.  2021-12-26 20:08:41
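The warning is about the pattern of one container starting several services from a wrapper script. As a hedged sketch of why that pattern is fragile (the service commands below are placeholders): when the wrapper runs as PID 1 it must forward signals and watch its children itself, which a naive wrapper does not do.

```python
import signal
import subprocess
import sys

# Placeholder services -- stand-ins for e.g. a web server and a background worker.
CMDS = [["python", "web.py"], ["python", "worker.py"]]

def main():
    procs = [subprocess.Popen(cmd) for cmd in CMDS]

    def shutdown(signum, frame):
        # Without a handler like this, SIGTERM from `docker stop` ends only PID 1,
        # and the children are killed abruptly when the container is torn down.
        for p in procs:
            p.terminate()
        sys.exit(0)

    signal.signal(signal.SIGTERM, shutdown)

    # If one service dies, nothing here restarts it -- one reason the article
    # recommends one process per container, supervised by the orchestrator.
    for p in procs:
        p.wait()

if __name__ == "__main__":
    main()
```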
Ruby  New posts tagged Rails - Qiita  How to fix a failed bundle install of the mysql2 gem in Rails: solved by switching from LibreSSL to OpenSSL  https://qiita.com/hajsu00/items/5be5d9da574529a4194e  Running brew info openssl reports "openssl: stable (bottled), keg-only, Cryptography and SSL/TLS Toolkit ... Not installed" (that is, OpenSSL is not installed), followed by the formula file and its Apache license (snipped). I read this as: Homebrew knows the OpenSSL formula exists, but it is not actually installed.  2021-12-26 20:13:49
Overseas Tech  MakeUseOf  How to Select All on a Mac  https://www.makeuseof.com/mac-how-to-select-all/  different  2021-12-26 11:30:12
Overseas Tech  DEV Community  Tensorflow Lyrics Generation  https://dev.to/ashwinscode/tensorflow-lyrics-generation-342f

Hi! Welcome to this post about lyrics (text) generation in Tensorflow. The project described in this post can be found on my GitHub, along with my GitHub profile. I'd appreciate any feedback on anything on my profile, and if you like anything you see, please leave a star on it too.

Aim
What is our aim in this post? We want to create a bot that, given a starting phrase, generates its own lyrics, powered by a machine learning model that has learned from the lyrics of previously written songs. There are three main steps we have to take:
1. Prepare our training data
2. Build our machine learning model
3. Train and use our model

How will our model work?
Before we do anything, we must think about how our model will work, since that tells us how to prepare our training data. Our model will take a one-hot encoded sequence of characters and try to predict the next character in the sequence, based on the characters before it. Example: input "hello worl" -> model -> "d" (the next predicted character).

Note: one-hot encoding is a way of vectorising data that can be split into categories, where each category has an integer ID. In our case each unique character gets its own ID; we have categorised the text into its unique characters. One-hot encoding represents an ID as a vector whose length equals the number of categories and which is all zeroes except for a 1 at the index of that ID. For example, if our only categories were the 26 letters of the alphabet, then "a" (ID 0) would be encoded as a vector of length 26 with a 1 at index 0 and 0 everywhere else, and "b" (ID 1) would have its 1 at index 1.

Preparing the training data
For my project I decided to use Metallica songs as the dataset for the model to train on. Note: this is quite a small dataset by machine learning standards, so our model won't produce amazing results, but it allows for quicker training times, so we get to see results sooner. If you would like a much more accurate model, I suggest using a larger dataset. I saved the lyrics of each song as a numbered text file (data0.txt, data1.txt, data2.txt, ...).

Now we need to process our data into inputs and outputs. Our inputs are sequences of characters, and the outputs are the characters that should come next after each input sequence. We process the texts by taking every substring of a chosen length and splitting it so that the last character is the output and the rest of the substring is the input sequence. For example, the substring "tensorflow is cool" would be split as: input sequence "tensorflow is coo", output "l". We do this for every substring in our lyrics data, encode both inputs and outputs, and put them into input/output arrays. In my project I chose a fixed length for the input character sequences. Here is the code for prepping our dataset (numeric values lost from the original post, such as the sequence length and file count, are replaced by named constants with illustrative values):

```python
SEQ_LEN = 20    # length of each input character sequence (illustrative value)
NUM_SONGS = 10  # number of lyric files data0.txt ... (illustrative value)

def get_character_count():
    # returns the number of possible characters
    alphabet = get_alphabet()
    return len(alphabet)

def get_alphabet():
    # returns the list of all characters we will allow from our dataset:
    # the lower-case alphabet, spaces and new lines
    return list("abcdefghijklmnopqrstuvwxyz \n")

def text_to_vector(text):
    # takes in a text and returns it as a sequence of one-hot encodings
    # representing each character in the text
    alphabet = get_alphabet()
    vector = []
    for char in text:
        if char.lower() in alphabet:
            one_hot = [0] * get_character_count()
            index = alphabet.index(char.lower())
            one_hot[index] = 1
            vector.append(one_hot)
    return vector

def prep_dataset(file):
    # takes the file name where certain text data is stored and returns
    # the input sequences array and output characters array
    text = open(file, "r").read()
    vec = text_to_vector(text)  # one-hot encoding the text
    xs = []  # input sequence array
    ys = []  # output character array
    i = 0
    while i + SEQ_LEN < len(vec):   # loop for finding each substring
        x = vec[i:i + SEQ_LEN]      # input sequence
        y = vec[i + SEQ_LEN]        # output character
        xs.append(x)
        ys.append(y)
        i += 1
    return xs, ys

if __name__ == "__main__":
    x = []  # input sequences
    y = []  # output characters
    for i in range(NUM_SONGS):
        # goes through all the dataset files and adds the inputs and outputs to x and y
        a, b = prep_dataset(f"data{i}.txt")
        x.extend(a)
        y.extend(b)
```

Building our model
Now that we have prepared our data, we can build our model. Remember, our model takes a sequence of characters and predicts the next character in that sequence. When dealing with sequential data it is best to use recurrent neural networks (if you don't know how a normal neural network works, I suggest researching that first). In sequential data each data point is influenced by the data points before it, so context is crucial when predicting what comes next. Normal feed-forward neural networks cannot model this, since they only pass data from layer to layer and have no notion of time. Recurrent neural networks, however, have layers that loop their outputs back into themselves, which gives the network context: the layer looks at each element (time step) in the sequence and produces an output together with a hidden state, and that hidden state is passed back into the layer when it looks at the next time step, preserving context.

Vanishing gradients
RNNs, however, suffer from short-term memory loss: information from far back in the sequence gets lost as the time step increases. This is caused by vanishing gradients. When a neural network trains, it calculates the derivative (gradient) of its loss function with respect to all of its weights, and this gradient is used to adjust the weights. As the loss is backpropagated through each layer, the gradient gets smaller and smaller, so it has only a small effect on the weights in the earlier layers, which therefore do very little learning. With RNNs this means that early time steps in a sequence are forgotten by the network and have no influence on the output. This can be fixed by using LSTMs and GRUs, special types of RNN that solve the vanishing gradient problem: they have gates which determine what to preserve or remove from the hidden states they receive, giving them long-term memory. A separate post (linked in the original article) explains RNNs in greater detail.

With the theory out of the way, we can use Tensorflow to build our model. The code should be self-explanatory if you are familiar with the Tensorflow API (layer sizes and other hyperparameters did not survive extraction, so the values below are illustrative):

```python
import numpy as np
import tensorflow as tf

def build_model():
    model = tf.keras.Sequential([
        # accepts sequences of one-hot character vectors of any length
        tf.keras.layers.LSTM(128, input_shape=(None, get_character_count()), return_sequences=True),
        tf.keras.layers.Dropout(0.2),
        tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(128)),
        tf.keras.layers.Dense(128),
        tf.keras.layers.Dense(get_character_count(), activation="softmax")
    ])
    model.compile(loss=tf.keras.losses.CategoricalCrossentropy(),
                  optimizer=tf.keras.optimizers.Adam())
    return model

def train_model(model, x, y):
    print("Training")
    model.fit(x, y, epochs=100)  # epoch count illustrative
    model.save("save")
```

Training and using our model
To train our model, all we have to do is add a few more lines. The final training script, train.py, combines the dataset-preparation code and the model code above (with the numpy and tensorflow imports) plus the following main block:

```python
if __name__ == "__main__":
    model = build_model()
    x = []  # input sequences
    y = []  # output characters
    for i in range(NUM_SONGS):
        # goes through all the dataset files and adds the inputs and outputs to x and y
        a, b = prep_dataset(f"data{i}.txt")
        x.extend(a)
        y.extend(b)
    train_model(model, np.array(x, dtype=float), np.array(y, dtype=float))
```

Name that file train.py. Now all we need to do is use our model. We want our bot to ask the user for an input string and then produce some lyrics. Since our model only produces one letter at a time, we need to:
1. Start with the input sequence.
2. Pass the input sequence to the model to predict the next character.
3. Add this character to the input sequence and drop the first letter of the sequence.
4. Repeat steps 2 and 3 however many times you want, to produce a set of lyrics.

```python
# run.py
import tensorflow as tf
import numpy as np
from train import get_alphabet, text_to_vector
from autocorrect import Speller

spell = Speller()

def gen_text(model, inp, length):
    # inp: input sequence, length: number of characters to produce
    alphabet = get_alphabet()
    res = inp  # final output
    for _ in range(length):
        vec = text_to_vector(inp)              # encoding the input
        vec = np.expand_dims(vec, axis=0)      # matching the input shape our model expects
        index = np.argmax(model.predict(vec))  # passing the input to our model
        letter = alphabet[index]               # decoding the output to a letter
        res += letter                          # adding the letter to our output string
        inp += letter                          # adding the letter to the input sequence
        inp = inp[1:]                          # dropping off the first letter of the input sequence
    return spell(res)  # return spell-checked output

model = tf.keras.models.load_model("save")

while True:
    print()
    print(gen_text(model, input("Enter seed phrase: "), 400))  # output length illustrative
    print()
```

Since we are producing text at the character level, there are bound to be quite a few spelling mistakes, so I decided to use an autocorrect library to clean up the resulting text. Here are the results (output of python run.py, Tensorflow warnings omitted; line breaks in the generated lyrics are shown as "/"):

Enter seed phrase: Never will it mend
Never will it mend / now the truth of me / of live / all silence the exist / cannot kill the the family / battery / never / fire / to begin whipping one / no nothing no the matters breath / oh it so met mor the role me can see / and it just free the find / never will the time / nothing is the ear fire / truth wind to see / man me will the death / writing dawn aninimine in me / cannot justice the battery / pounding either as taken my stream / to the will is the existing there is bore / make it our lothenent / born one row the better the existing fro

Enter seed phrase: hold my battery of breath
hold my battery of breath of eyes to set death / oh straw hat your humanity / late the ust comes before but they su / never cared to be / i the estimate it life the lost fill dead / so red / so true / battery / no nothing life now i me crossing ftindare / so true myself in me / now pain i mean / so net would / to be / no ripped to are / so prmdimply solute more is to you hear / taken my end / truth the within so let it be worth / tro finding / something / mutilation cancellation cancellation / austin / so let it be resting spouses the stan / serve goth

As you can see, the resulting text doesn't make too much sense, but it can string together some phrases that do. This could be improved by implementing a model that produces text at the word level, or by using a larger dataset. You could also look into technologies like GPT-3, which has billions of parameters and produces extremely human-like text.  2021-12-26 11:02:58
News  BBC News - Home  Desmond Tutu: UK tributes paid to archbishop  https://www.bbc.co.uk/news/uk-59794470?at_medium=RSS&at_campaign=KARANGA  archbishop  2021-12-26 11:22:54
LifeHack  Lifehacker Japan  This New Year's break, allow yourself the luxury of deliberately doing nothing  https://www.lifehacker.jp/2021/12/248076-matome-stress-time-to-do-nothing.html  New Year holidays  2021-12-26 21:00:00
Hokkaido  Hokkaido Shimbun  Levanga lose second straight, falling 82-95 to Alvark Tokyo  https://www.hokkaido-np.co.jp/article/627777/  losing streak  2021-12-26 20:18:06
Hokkaido  Hokkaido Shimbun  "Go To" meal voucher sales sluggish with 380,000 books unsold; prefecture and the Hokkaido federation of chambers of commerce step up efforts to promote use of the restaurant-support scheme  https://www.hokkaido-np.co.jp/article/627776/  novel coronavirus  2021-12-26 20:08:00
Hokkaido  Hokkaido Shimbun  All-Japan Judo Championships: Hyoga Ota takes his first title, defeating Haga; Kageura falls short  https://www.hokkaido-np.co.jp/article/627775/  All-Japan Championships  2021-12-26 20:07:00
Hokkaido  Hokkaido Shimbun  263 new infections nationwide, no deaths; COVID-19 patients in severe condition rise to 38  https://www.hokkaido-np.co.jp/article/627774/  novel coronavirus  2021-12-26 20:02:00
