IT |
気になる、記になる… |
Is an MR headset announcement all but certain? - Apple invites AR/VR experts to WWDC23 |
https://taisy0.com/2023/05/24/172144.html
|
apple |
2023-05-24 12:37:04 |
IT |
気になる、記になる… |
Tapbots releases the Mac version of its Mastodon client app "Ivory" |
https://taisy0.com/2023/05/24/172141.html
|
ivory |
2023-05-24 12:16:49 |
IT |
ITmedia 総合記事一覧 |
[ITmedia News] What is YouTube's "Content ID", in the news after a TBS announcer's apology? |
https://www.itmedia.co.jp/news/articles/2305/24/news209.html
|
itmedianewstbs |
2023-05-24 21:12:00 |
IT |
ITmedia 総合記事一覧 |
[ITmedia News] Shobunsha launches "MAPPLE Archives", e-book reissues of Showa- and Heisei-era city maps |
https://www.itmedia.co.jp/news/articles/2305/24/news208.html
|
amazon |
2023-05-24 21:10:00 |
AWS |
AWS Database Blog |
Validate database objects after migrating from IBM Db2 z/OS to Amazon RDS for PostgreSQL or Amazon Aurora PostgreSQL |
https://aws.amazon.com/blogs/database/validate-database-objects-after-migrating-from-ibm-db2-z-os-to-amazon-rds-for-postgresql-or-amazon-aurora-postgresql/
|
Customers are modernizing their mission-critical, legacy, on-premises IBM Db2 for z/OS databases to Amazon Relational Database Service (Amazon RDS) for PostgreSQL or Amazon Aurora PostgreSQL-Compatible Edition for their scalability, performance, agility, and availability. You can use the AWS Schema Conversion Tool (AWS SCT) to simplify the schema conversion from Db2 for z/OS to Amazon RDS … |
2023-05-24 12:49:55 |
python |
New posts tagged Python - Qiita |
How to pass context variables such as data_interval_end from Airflow to Dataform |
https://qiita.com/aibazhang/items/5fcff1e4de5b15637bd6
|
datafor |
2023-05-24 21:05:17 |
Ruby |
New posts tagged Ruby - Qiita |
My impressions of trying GitHub Copilot |
https://qiita.com/api_17/items/94086b4cb59d5283deea
|
githu |
2023-05-24 21:49:47 |
AWS |
New posts tagged AWS - Qiita |
[AWS Q&A 365][SNS] Daily five common AWS questions #65 |
https://qiita.com/shinonome_taku/items/d0c10916dcf6a95737fa
|
amazonsns |
2023-05-24 21:53:24 |
AWS |
New posts tagged AWS - Qiita |
[AWS Q&A 365][SNS]Daily Five Common Questions #65 |
https://qiita.com/shinonome_taku/items/3eaf0a2bff55bb71d26a
|
amazon |
2023-05-24 21:51:58 |
AWS |
New posts tagged AWS - Qiita |
How to debug by entering a container directly with ECS Exec |
https://qiita.com/okdyy75/items/2ed9643c8ab5867dd415
|
ecstaskexec |
2023-05-24 21:50:36 |
Docker |
New posts tagged docker - Qiita |
Creating an ECR repository with Terraform and running docker build & push |
https://qiita.com/suzuki-navi/items/613311d1a31d0306be0d
|
dockerbuildamppush |
2023-05-24 21:38:39 |
Docker |
New posts tagged docker - Qiita |
Can't log in to MySQL in a Docker container |
https://qiita.com/kibwwen/items/9ee328dea3e4c5d5d6f2
|
docker |
2023-05-24 21:35:32 |
Ruby |
New posts tagged Rails - Qiita |
My impressions of trying GitHub Copilot |
https://qiita.com/api_17/items/94086b4cb59d5283deea
|
githu |
2023-05-24 21:49:47 |
Tech blog |
Developers.IO |
[Update] Amazon QuickSight now supports running asset import/export jobs |
https://dev.classmethod.jp/articles/quicksight-inport-export-job/
|
amazonquicksight |
2023-05-24 12:00:54 |
Overseas TECH |
DEV Community |
Deep Neural Network from Scratch in Rust 🦀- Part 4- Loss Function and Back Propagation |
https://dev.to/akshayballal/deep-neural-network-from-scratch-in-rust-part-4-loss-function-and-back-propagation-133a
|
After forward propagation, we need to define a loss function to calculate how wrong our model is at this moment. For a simple binary classification problem, the loss function is given as below:

J(w, b) = -(1/m) Σ [ Y log(A^[L]) + (1 - Y) log(1 - A^[L]) ]

where

m → number of training examples
Y → true training labels
A^[L] → predicted labels from forward propagation

The purpose of the loss function is to measure the discrepancy between the predicted labels and the true labels. By minimizing this loss, we aim to make our model's predictions as close as possible to the ground truth.

To train the model and minimize the loss, we employ a technique called backward propagation. This technique calculates the gradients of the cost function with respect to the weights and biases, which indicate the direction and magnitude of the adjustments required for each parameter. The gradient computations are performed using the following equations for each layer l:

dZ^[l] = dA^[l] ∗ g′(Z^[l])
dW^[l] = (1/m) dZ^[l] (A^[l-1])^T
db^[l] = (1/m) Σ dZ^[l]   (summed along the examples axis)
dA^[l-1] = (W^[l])^T dZ^[l]

If you have some background in calculus, you can see the derivations for these equations; we use the simple chain rule of derivatives to find each gradient. Once we have calculated the gradients, we can adjust the weights and biases to minimize the loss. The following equations are used for updating the parameters with a learning rate α:

W^[l] = W^[l] - α × dW^[l]
b^[l] = b^[l] - α × db^[l]

These equations update the weights and biases of each layer based on their respective gradients. By iteratively performing the forward and backward passes and updating the parameters using the gradients, we allow the model to learn and improve its performance over time.

[figure: how data flows between the forward and the backward pass] Here you can see why we stored the linear cache and activation cache in the previous part: we reuse those values in the backward pass.

Let's get coding now. The git repository for all the code until this part is provided in the link below; please refer to it in case you are stuck somewhere.

Cost Function

To calculate the cost based on the cost equation above, we first need to provide a log trait for Array2<f32>, as you cannot directly take the log of an array in Rust. We do this by adding the following code at the start of lib.rs:

```rust
use ndarray::Array2;

trait Log {
    fn log(&self) -> Array2<f32>;
}

impl Log for Array2<f32> {
    fn log(&self) -> Array2<f32> {
        self.mapv(|x| x.log(std::f32::consts::E))
    }
}
```

Next, in our impl DeepNeuralNetwork we add a function to calculate the cost:

```rust
pub fn cost(&self, al: &Array2<f32>, y: &Array2<f32>) -> f32 {
    let m = y.shape()[1] as f32;
    let cost = -(1.0 / m)
        * (y.dot(&al.clone().reversed_axes().log())
            + (1.0 - y).dot(&(1.0 - al).reversed_axes().log()));
    cost.sum()
}
```

Here we pass in the last-layer activations al and the true labels y to calculate the cost.

Backward Activations

```rust
pub fn sigmoid_prime(z: &f32) -> f32 {
    sigmoid(z) * (1.0 - sigmoid(z))
}

pub fn relu_prime(z: &f32) -> f32 {
    match *z > 0.0 {
        true => 1.0,
        false => 0.0,
    }
}

pub fn sigmoid_backward(da: &Array2<f32>, activation_cache: ActivationCache) -> Array2<f32> {
    da * activation_cache.z.mapv(|x| sigmoid_prime(&x))
}

pub fn relu_backward(da: &Array2<f32>, activation_cache: ActivationCache) -> Array2<f32> {
    da * activation_cache.z.mapv(|x| relu_prime(&x))
}
```

The sigmoid_prime function calculates the derivative of the sigmoid activation function: it takes the input z and returns sigmoid(z) multiplied by (1 - sigmoid(z)). The relu_prime function computes the derivative of the ReLU activation function: it returns 1 if z is greater than 0, and 0 otherwise.

The sigmoid_backward function calculates the backward propagation for the sigmoid activation function. It takes the derivative of the cost function with respect to the activation, da, and the activation cache, and performs an element-wise multiplication between da and the derivative of the sigmoid function applied to the cached values activation_cache.z. The relu_backward function does the same for the ReLU activation function.

Linear Backward

```rust
use ndarray::Axis;

pub fn linear_backward(
    dz: &Array2<f32>,
    linear_cache: LinearCache,
) -> (Array2<f32>, Array2<f32>, Array2<f32>) {
    let (a_prev, w, _b) = (linear_cache.a, linear_cache.w, linear_cache.b);
    let m = a_prev.shape()[1] as f32;
    let dw = (1.0 / m) * dz.dot(&a_prev.reversed_axes());
    let db_vec = ((1.0 / m) * dz.sum_axis(Axis(1))).to_vec();
    let db = Array2::from_shape_vec((db_vec.len(), 1), db_vec).unwrap();
    let da_prev = w.reversed_axes().dot(dz);
    (da_prev, dw, db)
}
```

The linear_backward function calculates the backward propagation for the linear component of a layer. It takes the gradient of the cost function with respect to the linear output, dz, and the linear cache, and returns the gradients with respect to the previous layer's activation (da_prev), the weights (dw), and the biases (db). The function first extracts the previous layer's activation a_prev, the weight matrix w, and the bias matrix b from the linear cache, and reads the number of training examples m from the shape of a_prev. It then calculates dw as the dot product of dz and the transposed a_prev, scaled by 1/m; computes db by summing the elements of dz along the examples axis and scaling the result by 1/m; and finally computes da_prev as the dot product of the transposed w and dz.

Backward Propagation

```rust
use std::collections::HashMap;

impl DeepNeuralNetwork {
    pub fn initialize_parameters(&self) -> HashMap<String, Array2<f32>> {
        // same as last part
    }

    pub fn forward(
        &self,
        x: &Array2<f32>,
        parameters: &HashMap<String, Array2<f32>>,
    ) -> (Array2<f32>, HashMap<String, (LinearCache, ActivationCache)>) {
        // same as last part
    }

    pub fn backward(
        &self,
        al: &Array2<f32>,
        y: &Array2<f32>,
        caches: HashMap<String, (LinearCache, ActivationCache)>,
    ) -> HashMap<String, Array2<f32>> {
        let mut grads = HashMap::new();
        let num_of_layers = self.layers.len() - 1;

        // dA[L] = -(Y/A[L] - (1-Y)/(1-A[L]))
        let dal = -(y / al - (1.0 - y) / (1.0 - al));

        let current_cache = caches[&num_of_layers.to_string()].clone();
        let (mut da_prev, mut dw, mut db) =
            linear_backward_activation(&dal, current_cache, "sigmoid");

        let weight_string = ["dW", &num_of_layers.to_string()].join("");
        let bias_string = ["db", &num_of_layers.to_string()].join("");
        let activation_string = ["dA", &num_of_layers.to_string()].join("");

        grads.insert(weight_string, dw);
        grads.insert(bias_string, db);
        grads.insert(activation_string, da_prev.clone());

        for l in (1..num_of_layers).rev() {
            let current_cache = caches[&l.to_string()].clone();
            (da_prev, dw, db) =
                linear_backward_activation(&da_prev, current_cache, "relu");

            let weight_string = ["dW", &l.to_string()].join("");
            let bias_string = ["db", &l.to_string()].join("");
            let activation_string = ["dA", &l.to_string()].join("");

            grads.insert(weight_string, dw);
            grads.insert(bias_string, db);
            grads.insert(activation_string, da_prev.clone());
        }

        grads
    }
}
```

The backward method in the DeepNeuralNetwork struct performs the backward-propagation algorithm to calculate the gradients of the cost function with respect to the parameters (weights and biases) of each layer. The method takes the final activation al obtained from forward propagation, the true labels y, and the caches containing the linear and activation values for each layer. First, it initializes an empty HashMap called grads to store the gradients, and computes the initial derivative of the cost function with respect to al using the formula above. Then, starting from the last (output) layer, it retrieves the cache for the current layer and calls the linear_backward_activation function to calculate the gradients of the cost function with respect to the parameters of that layer; the activation function used for the last layer is sigmoid. The computed gradients for weights, biases, and activation are stored in the grads map. Next, the method iterates over the remaining layers in reverse order; for each layer it retrieves the cache, calls linear_backward_activation (with ReLU), and stores the gradients in the grads map. Finally, the method returns the grads map containing the gradients of the cost function with respect to each parameter of the neural network. This completes the backward-propagation step; these gradients will be used in the optimization step to update the parameters and minimize the cost.

Update Parameters

Let us now update the parameters using the gradients that we calculated:

```rust
pub fn update_parameters(
    &self,
    params: &HashMap<String, Array2<f32>>,
    grads: HashMap<String, Array2<f32>>,
    m: f32,
    learning_rate: f32,
) -> HashMap<String, Array2<f32>> {
    let mut parameters = params.clone();
    let num_of_layers = self.layer_dims.len() - 1;
    for l in 1..=num_of_layers {
        let weight_string_grad = ["dW", &l.to_string()].join("");
        let bias_string_grad = ["db", &l.to_string()].join("");
        let weight_string = ["W", &l.to_string()].join("");
        let bias_string = ["b", &l.to_string()].join("");

        // Gradient-descent step, with an L2 regularization term (lambda / m)
        // applied to the weights.
        let new_w = parameters[&weight_string].clone()
            - learning_rate
                * (grads[&weight_string_grad].clone()
                    + (self.lambda / m) * parameters[&weight_string].clone());
        let new_b = parameters[&bias_string].clone()
            - learning_rate * grads[&bias_string_grad].clone();
        parameters.insert(weight_string, new_w);
        parameters.insert(bias_string, new_b);
    }
    parameters
}
```

In this code we go through each layer and update the parameters in the HashMap for each layer by using the HashMap of gradients in that layer. This returns the updated parameters.

That's all for this part. I know this was a little involved, but this part is the heart of a deep neural network. Here are some resources that can help you understand the algorithm more visually: An Overview of the Back Propagation Algorithm, and Calculus Behind the Back Propagation Algorithm. In the next and final part of this series, we will run our training loop and test out our model on some cat images.

GitHub Repository, My Website, Twitter, LinkedIn |
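The forward/backward/update cycle described above can be sanity-checked numerically. Below is a minimal std-only sketch (not from the article; the example values, and the scalar stand-ins for the matrices, are illustrative) that runs the same equations for a single sigmoid neuron on one training example, and compares the analytic dW against a centered finite difference of the cost:

```rust
// Single sigmoid neuron: forward pass, backward pass, and one update step,
// using the same equations as the article, with a finite-difference check.

fn sigmoid(z: f64) -> f64 {
    1.0 / (1.0 + (-z).exp())
}

// Binary cross-entropy for a single example (m = 1):
// J = -[ y*ln(a) + (1-y)*ln(1-a) ]
fn cost(a: f64, y: f64) -> f64 {
    -(y * a.ln() + (1.0 - y) * (1.0 - a).ln())
}

fn main() {
    let (x, y) = (0.7_f64, 1.0_f64); // one training example
    let (w, b) = (0.3_f64, -0.1_f64); // initial parameters

    // Forward pass: z = w*x + b, a = sigmoid(z).
    let z = w * x + b;
    let a = sigmoid(z);

    // Backward pass, following the article's equations:
    // dA = -(y/a - (1-y)/(1-a)), dZ = dA * sigmoid'(z), dW = dZ * x, db = dZ.
    let da = -(y / a - (1.0 - y) / (1.0 - a));
    let dz = da * (sigmoid(z) * (1.0 - sigmoid(z)));
    let dw = dz * x;
    let db = dz;

    // Finite-difference check: (J(w+h) - J(w-h)) / (2h) should match dW.
    let h = 1e-6;
    let num_dw =
        (cost(sigmoid((w + h) * x + b), y) - cost(sigmoid((w - h) * x + b), y)) / (2.0 * h);
    assert!((dw - num_dw).abs() < 1e-6);

    // One gradient-descent update with learning rate alpha.
    let alpha = 0.1;
    let (w_new, b_new) = (w - alpha * dw, b - alpha * db);
    println!("dW = {dw:.6} (numeric: {num_dw:.6}); updated w = {w_new:.4}, b = {b_new:.4}");
}
```

The same check scales to the full network: perturb one entry of any W^[l], recompute the cost through the forward pass, and the finite difference should agree with the corresponding entry of dW^[l] from backward.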
2023-05-24 12:28:35 |
Overseas TECH |
Engadget |
Samsung Galaxy Watch 6 leak suggests the rotating bezel will return |
https://www.engadget.com/samsung-galaxy-watch-6-leak-suggests-the-rotating-bezel-will-return-124637657.html?src=rss
|
A favorite Samsung Galaxy Watch feature might just be making a comeback. MySmartPrice has shared leaked renders, procured by tipster OnLeaks, that appear to show the full Samsung Galaxy Watch 6 Classic design, and it includes a physical rotating bezel. Samsung notably removed the physical dial from recent models, which instead use a touch bezel that requires users to swipe a finger along the edge of the screen to change between apps or faces. Earlier models had a physical rotating bezel for easy twisting, without necessarily needing to double-check the placement of your finger. Judging by the leak, the bezel on the upcoming model will be slightly thinner compared to those earlier versions, and reports suggest it will border a Super AMOLED display. SamMobile also suggests that the Galaxy Watch 6 Classic will be powered by a newer Exynos W-series chip, giving it a performance boost over the chip found in previous Galaxy Watch models. It will likely be a couple of months before the return of the physical bezel is fully confirmed: Samsung is expected to announce the Galaxy Watch 6 alongside the Galaxy Z Flip, Galaxy Z Fold and Galaxy Tab S at its Unpacked event in the coming months. This article originally appeared on Engadget. |
2023-05-24 12:46:37 |
Overseas TECH |
Engadget |
The Fujifilm X-S20 puts vlogging right on its dial |
https://www.engadget.com/the-fujifilm-x-s20-puts-vlogging-right-on-it-its-dial-120841586.html?src=rss
|
Fujifilm is trying to beat Sony at its own game with the launch of the X-S20, a content-creation-oriented camera. Though it has a similar body and the same sensor as its predecessor, the X-S10, it offers some major improvements in terms of video quality and more. At the same time, it's considerably more expensive than the X-S10 was at launch.

"X-S20 is truly a dream camera for any content creator looking to take their photo and video creation to the next level, but especially for the ones that are documenting their lives, traveling the world or streaming their stories online," said Fujifilm's Lisa Baxt, essentially describing the camera's market and purpose.

Though it has the same last-generation X-Trans sensor as the X-S10, it uses the company's new X-Processor. That allowed Fujifilm to install its latest deep-learning AI autofocus technology, which boosts speeds and allows the camera to detect animals, birds, cars, motorcycles, bicycles, trains, insects and drones, much like the higher-end X-H and X-T models. Plus, it can detect all of those automatically, so the user doesn't need to pick a subject before shooting.

That also boosted the camera's video powers considerably. Where the X-S10's video options were more limited, the X-S20 can shoot open-gate video that can be cropped into any horizontal or vertical format you want, and it can also handle DCI 4K and super-slow-motion video. In addition, it supports F-Log with more stops of dynamic range than the X-S10 offered. It delivers a much higher bit rate thanks to support for faster UHS-II cards, though there's still only a single card slot. You can also record Apple ProRes and Blackmagic RAW video externally, to either Atomos or Blackmagic recorders. Finally, Fujifilm is offering an optional external cooling fan that extends 4K recording times.

Fujifilm flattered Sony by imitation with its dedicated "Vlog" function on the mode dial. This new setting gives you direct access to a vlogging touch menu that offers functions like product-priority focus mode, background defocus, high-speed recording, face/eye detection and more. Much like Sony's V-series models, product-priority mode disables face/eye detection so the camera will focus on a product placed in front of it, while background defocus opens the lens aperture as wide as possible for more background blur.

Also new is UVC/UAC support that lets the camera work directly as a webcam by just plugging it into your PC; you can also stream video live online using OBS Studio. For photography, the X-S20 can fire faster bursts in electronic-shutter mode than in mechanical mode, and the buffer supports considerably more JPEG or compressed RAW images in mechanical mode than before. However, it is limited to fewer frames for uncompressed RAW images, about double what the X-S10 could manage.

As before, it comes with five-axis in-body stabilization, though Fujifilm has boosted the power from six stops to seven with supported lenses. It also has a fully articulating display with a higher-resolution panel than before, and an OLED electronic viewfinder with a fast refresh rate. It retains much the same body design, with a slightly larger grip and a weight that's a touch heavier, but it's still pretty light for such a powerful camera. Other features include microphone, headphone and micro-HDMI ports, and yes, the pop-up flash is back.

The X-S20 is priced body-only at considerably more than the launch price of the X-S10; it is also available in kits with XC or XF lenses. Shipping starts in June.

Along with the camera, Fujifilm unveiled the XApp, designed to control X- and GFX-series cameras for remote shooting, file transfers and more. The company said it "listened carefully to user feedback" when developing the app, so here's hoping it's a large step up from the previous, dreadful app. Fujifilm also unveiled an ultra-wide-angle XF lens, also shipping in June.

This article originally appeared on Engadget. |
2023-05-24 12:08:41 |
News |
BBC News - Home |
Teens killed in crash were not chased - police commissioner |
https://www.bbc.co.uk/news/uk-wales-65692972?at_medium=RSS&at_campaign=KARANGA
|
michael |
2023-05-24 12:46:36 |
News |
BBC News - Home |
Belgorod: Russia's Shoigu vows 'harsh response' after incursion into Russia |
https://www.bbc.co.uk/news/world-europe-65696731?at_medium=RSS&at_campaign=KARANGA
|
belgorod |
2023-05-24 12:06:13 |
News |
BBC News - Home |
95-year-old woman Tasered by police in Australia dies |
https://www.bbc.co.uk/news/world-australia-65696475?at_medium=RSS&at_campaign=KARANGA
|
australia |
2023-05-24 12:32:51 |
News |
BBC News - Home |
Man jailed for rape and murder of Jill Barclay in Aberdeen |
https://www.bbc.co.uk/news/uk-scotland-north-east-orkney-shetland-65495249?at_medium=RSS&at_campaign=KARANGA
|
september |
2023-05-24 12:21:34 |
News |
BBC News - Home |
Did a lack of opportunity contribute to Cardiff riot? |
https://www.bbc.co.uk/news/uk-wales-65693977?at_medium=RSS&at_campaign=KARANGA
|
people |
2023-05-24 12:02:23 |
IT |
Weekly ASCII |
"HUAWEI S-TAG": attach it to your shoes or waist to capture detailed running data |
https://weekly.ascii.jp/elem/000/004/138/4138110/
|
huaweistag |
2023-05-24 21:55:00 |
Overseas TECH |
reddit |
Pronunciation of Japanese people/place names while speaking English |
https://www.reddit.com/r/japanlife/comments/13qk70f/pronunciation_of_japanese_peopleplace_names_while/
|
Recently one of my foreigner friends mentioned that he hates when other foreigners pronounce Japanese place names in a Japanese manner in the middle of an English conversation. What do you guys think? Some examples he gave were: "Roppongi" being pronounced with the pause (the っ) between the "ro" and the "pon"; the first syllable of "Hiroshima" being pronounced with a proper Japanese ひ-type sound instead of an English rendering (that is, instead of pronouncing it as "he", as in the opposite of "she"); and "Tokyo" being pronounced as "toukyou" instead of the English rendering in which the "o" sounds are not elongated. He thinks it's fine if Japanese people or those born and raised here do it, but when foreigners who grew up entirely abroad do it, it feels like they're trying to show off. What do you guys think? Btw, I totally disagreed with my friend on this. submitted by /u/AlwaysLearn to r/japanlife |
2023-05-24 12:27:30 |