python |
New posts tagged Python - Qiita |
Set SYSTEM_VERSION_COMPAT=1 when grpcio fails to install on macOS Big Sur |
https://qiita.com/que9/items/02af84f25d040ea33655
|
tl;dr: run SYSTEM_VERSION_COMPAT=1 pip install grpcio. Environment where the problem occurred: macOS Big Sur, Python, pip, grpcio. Background: while resolving Python package dependencies, only grpcio failed to install. |
2021-05-03 18:56:04 |
python |
New posts tagged Python - Qiita |
picoCTF Practice Compress and Attack Writeup |
https://qiita.com/housu_jp/items/e003bc5dace6189bd554
|
picoCTF Practice "Compress and Attack" writeup: a problem solved by building socket communication in Python and brute-forcing it. |
2021-05-03 18:43:10 |
python |
New posts tagged Python - Qiita |
Using TensorBoardX from Docker |
https://qiita.com/iwankoTG/items/f1e3e4d13cda8e8eb1ad
|
Updated my Docker environment so that TensorBoardX can also be used. Since tensorboardX visualizes the ONNX (Open Neural Network Exchange) format, a common format for neural networks, a model created with PyTorch first has to be converted to ONNX. |
2021-05-03 18:22:12 |
python |
New posts tagged Python - Qiita |
How I made "posting markdown text to Medium" work entirely on an iPad |
https://qiita.com/tick-taku/items/a6e633c344161dba28f6
|
And that's it: markdown text written on an iPad can now be posted to Medium as-is, so you can show off at a stylish café all you like. About the implementation: the reason everything, including init, is written in Python is that, although I do provide an init.sh so it can also be used from a PC, that script could not be executed in the shell. |
2021-05-03 18:13:09 |
python |
New posts tagged Python - Qiita |
The secant method |
https://qiita.com/roadto93ds/items/6dbbf4fe6e3b83e44498
|
Newton's method updates x_new = x_old - f(x_old) / f'(x_old); when f'(x_old) is unknown (or you don't want to compute it), the secant method approximates the derivative by a finite difference. |
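The update rule summarized above can be sketched in a few lines of Python; the sample function, starting points, and tolerance below are illustrative choices, not taken from the article:

```python
def secant(f, x0, x1, tol=1e-10, max_iter=100):
    """Approximate a root of f with the secant method: Newton's update
    with the derivative replaced by a finite-difference approximation."""
    for _ in range(max_iter):
        f0, f1 = f(x0), f(x1)
        if f1 - f0 == 0:        # flat secant line; cannot continue
            break
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)
        if abs(x2 - x1) < tol:  # converged
            return x2
        x0, x1 = x1, x2
    return x1

# Root of x^2 - 2 (i.e. sqrt(2)) without ever computing a derivative
root = secant(lambda x: x * x - 2, 1.0, 2.0)
```

No derivative function is passed in; only evaluations of f itself are needed, which is the point the post makes.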
2021-05-03 18:11:44 |
python |
New posts tagged Python - Qiita |
Getting to know TensorFlow's Dataset |
https://qiita.com/typecprint/items/3d10e77e76e74db6e9e9
|
model.fit(ds, epochs=..., steps_per_epoch=...): incidentally, if you pass in a ds that does not use repeat() and set steps_per_epoch to more than count / batch_size, you get an error like the one below. |
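The constraint behind that error can be sketched without TensorFlow: with no repeat(), a dataset of count examples batched by batch_size yields only a fixed number of batches, so steps_per_epoch must stay within that bound. The concrete numbers below are made up for illustration:

```python
count = 1000      # total examples in the dataset (hypothetical)
batch_size = 32   # examples per batch (hypothetical)

# Without .repeat(), the dataset is exhausted after this many batches
# (counting the final partial batch), so a larger steps_per_epoch
# makes model.fit run out of data and raise an error.
max_steps_per_epoch = -(-count // batch_size)  # ceiling division
```

With repeat(), the dataset restarts indefinitely and any steps_per_epoch works.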
2021-05-03 18:11:24 |
js |
New posts tagged JavaScript - Qiita |
Notes on making a Discord BOT respond when called in a specific channel |
https://qiita.com/n0bisuke/items/ba3ed0a43642a966f635
|
I have a feeling I've written a similar note before. This is a note about branching the handling logic depending on which channel a user posted in. |
2021-05-03 18:59:07 |
js |
New posts tagged JavaScript - Qiita |
Displaying both Japanese and English text on Microsoft Docs |
https://qiita.com/ryuix/items/6d160fa2cf2189beb110
|
Displaying both Japanese and English text on Microsoft Docs. Overview: Microsoft Docs has long had a "Read in English" toggle for switching easily between Japanese and English, but the Japanese still feels questionable at times. |
2021-05-03 18:05:24 |
Program |
New questions for [all tags] - teratail |
'iTunes Connect': MyApp is not displayed |
https://teratail.com/questions/336430?rss=all
|
Only "Artists" jumps to its link destination. This is my first submission and there is a lot I don't understand, so I would appreciate any guidance. |
2021-05-03 18:55:23 |
Program |
New questions for [all tags] - teratail |
Cloud Functions .onCreate is not executed |
https://teratail.com/questions/336429?rss=all
|
Cloud Functions .onCreate does not run. Premise / goal: when a document is created or updated, the posts subcollection under users should be copied by Cloud Functions into the root posts collection. |
2021-05-03 18:51:34 |
Program |
New questions for [all tags] - teratail |
When redeploying on AWS (EC2), an error occurs when starting unicorn (with_friendly_errors) |
https://teratail.com/questions/336428?rss=all
|
When redeploying on AWS (EC2), an error occurs when starting unicorn. |
2021-05-03 18:49:14 |
Program |
New questions for [all tags] - teratail |
I want to know how many pieces split() produced |
https://teratail.com/questions/336427?rss=all
|
I want to know how many pieces split() produced. What I want to know: I want to branch on how many parts split() divides a string into. Specifics: see the attached file; I could not work it out. |
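A minimal sketch of the idea in the question (the sample string and delimiter are made up): len() on the list returned by split() gives the number of pieces, which can then drive the branching:

```python
text = "alpha,beta,gamma"   # made-up sample input
parts = text.split(",")     # split on a comma delimiter
count = len(parts)          # how many pieces split() produced

# Branch on the number of pieces
if count == 3:
    label = "three fields"
else:
    label = f"{count} fields"
```

Calling split() with no argument instead splits on any run of whitespace, and len() works the same way on that result.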
2021-05-03 18:44:45 |
Program |
New questions for [all tags] - teratail |
In WebAR with AR.js + A-frame, I want to play a sound along with displaying a glb |
https://teratail.com/questions/336426?rss=all
|
When the marker is detected, I want to play a sound along with displaying the glb. |
2021-05-03 18:35:28 |
Program |
New questions for [all tags] - teratail |
Compiling C++ with pybind11 via CMake does not work |
https://teratail.com/questions/336425?rss=all
|
|
2021-05-03 18:27:48 |
Program |
New questions for [all tags] - teratail |
About Room TypeConverters |
https://teratail.com/questions/336424?rss=all
|
I want to store a class like the following in Room. |
2021-05-03 18:21:19 |
Program |
New questions for [all tags] - teratail |
POSTing from IFTTT Webhooks to ngrok returns 400 Bad Request |
https://teratail.com/questions/336423?rss=all
|
POSTing from IFTTT Webhooks to ngrok returns 400 Bad Request. I want to build a setup like the one below in order to use google-home-notifier. |
2021-05-03 18:09:41 |
Program |
New questions for [all tags] - teratail |
Where is the source code? |
https://teratail.com/questions/336422?rss=all
|
Where is the source code? Hello. |
2021-05-03 18:02:24 |
Ruby |
New posts tagged Ruby - Qiita |
Rails auto-formatting tool RuboCop |
https://qiita.com/on97-nakamatsu-mayumi/items/776c6747319a7baa0642
|
It points out issues such as "inconsistent indentation" or "extra line breaks and spaces", based on the Ruby Style Guide. |
2021-05-03 18:22:52 |
Ruby |
New posts tagged Ruby - Qiita |
Steps for adding PKCE to a Doorkeeper setup that already has the Authorization Code Flow |
https://qiita.com/nobuo_hirai/items/0fdd27d3a43161815da5
|
Setup steps: run the following commands, as described in the wiki. |
2021-05-03 18:12:57 |
AWS |
New posts tagged AWS - Qiita |
[AWS & Postgres & Rails] Deploying a Rails app to EC2 |
https://qiita.com/Jackson123/items/e91eec2f76774717bdee
|
On GitHub, go to the remote repository of the app you want to deploy, click "Code", select "SSH", and copy the URL shown. If a directory named after the app appears when you run the ls command, the clone succeeded. |
2021-05-03 18:59:53 |
AWS |
New posts tagged AWS - Qiita |
AWS study notes |
https://qiita.com/Masataka_Sugi/items/5f314e8cf3d1416e8820
|
CloudFront (Amazon CloudFront): a CDN service that uses edge locations around the world to deliver content with low latency. CloudFront features: low-latency delivery through caching, delivery from locations close to the user, and strong security. Security: AWS Certificate Manager can be used to protect data in transit, and AWS Shield and AWS WAF protect against attacks. Route 53 (Amazon Route 53): a DNS service. Route 53 features: a variety of routing functions, including simple routing (answers with a single IP address), latency-based routing, GeoDNS (multiple DNS records prepared for a single domain so that geographically closer answers are served), weighted round robin, and multivalue answers; health checks and failover for high availability; and alias records for the zone apex (root domain). AWS database services: RDS (Amazon Relational Database Service). Point-in-time recovery: because transaction logs are retained, an instance can be launched at any specific point in time within the backup window. RDS Multi-AZ deployment: enabling Multi-AZ deployment ensures high availability, with replication between data centers at latencies of a few milliseconds. |
2021-05-03 18:01:55 |
Ruby |
New posts tagged Rails - Qiita |
[AWS & Postgres & Rails] Deploying a Rails app to EC2 |
https://qiita.com/Jackson123/items/e91eec2f76774717bdee
|
On GitHub, go to the remote repository of the app you want to deploy, click "Code", select "SSH", and copy the URL shown. If a directory named after the app appears when you run the ls command, the clone succeeded. |
2021-05-03 18:59:53 |
Ruby |
New posts tagged Rails - Qiita |
Rails auto-formatting tool RuboCop |
https://qiita.com/on97-nakamatsu-mayumi/items/776c6747319a7baa0642
|
It points out issues such as "inconsistent indentation" or "extra line breaks and spaces", based on the Ruby Style Guide. |
2021-05-03 18:22:52 |
Ruby |
New posts tagged Rails - Qiita |
Steps for adding PKCE to a Doorkeeper setup that already has the Authorization Code Flow |
https://qiita.com/nobuo_hirai/items/0fdd27d3a43161815da5
|
Setup steps: run the following commands, as described in the wiki. |
2021-05-03 18:12:57 |
Overseas TECH |
DEV Community |
Automate backing up your Docker volumes |
https://dev.to/hendr_ik/automate-backing-up-your-docker-volumes-3gdk
|
Automate backing up your Docker volumes

Docker is a ubiquitous tool for us while building Offen, a fair and open source web analytics software. It is foundational for our development setup, but we also use it for deploying our own Offen instance to production. One thing we found missing was a simple and lightweight tool for taking and managing remote backups of Docker volumes, which is why we wrote our own tool, offen/docker-volume-backup. In this post I'd like to introduce you to the tool and how to use it for automatically taking backups of the Docker volumes in your own setup.

Introduction to Docker volumes

Volumes are Docker's way of managing persistent data. As Docker containers themselves are ephemeral, volumes can be mounted into the container's filesystem, enabling you to persist data beyond the lifecycle of a container. Volumes are commonly used for storing database data or similar. For example, this is how you would use a Docker volume to persist data for an Offen container:

    docker volume create offen_data
    docker run -d -v offen_data:/var/opt/offen offen/offen:latest

In the running container, data stored in /var/opt/offen will be persisted in the offen_data volume and can be reused in other containers.

Using offen/docker-volume-backup

offen/docker-volume-backup is designed to run sidecared next to an application container and periodically take backups of volumes to any S3-compatible storage, i.e. AWS S3 itself or storages like MinIO or Ceph. It can run on any schedule you wish, and it can also take care of rotating away old backups after a configured retention period. If needed, it can temporarily stop and restart your running containers to ensure backup integrity. Using alpine as the base image and using the MinIO client instead of the AWS CLI for uploading files to the remote storage keeps the image small and lightweight.

Defining the sidecar container

The easiest way of managing such a setup is using docker-compose. A compose file that backs up its volumes would look something like this (values shown as "…" were lost in this copy of the article):

    version: "…"
    services:
      offen:
        image: offen/offen:latest
        volumes:
          - db:/var/opt/offen
        labels:
          - docker-volume-backup.stop-during-backup=true
      backup:
        image: offen/docker-volume-backup:v…
        # Ideally those values should go into an env file or Docker secrets,
        # as they contain credentials. It's easier to spell them out here in
        # the context of this tutorial though.
        environment:
          # A backup is taken each day
          BACKUP_CRON_EXPRESSION: "…"
          # Backups are stored with a timestamp appended
          BACKUP_FILENAME: "offen-db-%Y-%m-%dT%H-%M-%S.tar.gz"
          # Backups older than the given number of days will be pruned.
          # If this value is not given, backups will be kept forever.
          BACKUP_RETENTION_DAYS: "…"
          # Credentials for your storage backend
          AWS_ACCESS_KEY_ID: "<YOUR_ACCESS_KEY>"
          AWS_SECRET_ACCESS_KEY: "<YOUR_SECRET_KEY>"
          AWS_S3_BUCKET_NAME: "my-backups"
          # If given, backups are encrypted using GPG
          GPG_PASSPHRASE: "<SOME_KEY>"
        volumes:
          # This allows the tool to stop and restart all containers labeled
          # as docker-volume-backup.stop-during-backup
          - /var/run/docker.sock:/var/run/docker.sock:ro
          # All volumes mounted to /backup/<some-name> will be backed up
          - db:/backup/offen-db:ro
    volumes:
      db:

Of course, you can also use the image with plain Docker commands:

    docker volume create offen_data
    docker run -d -v offen_data:/var/opt/offen -l docker-volume-backup.stop-during-backup=true offen/offen:latest
    docker run -d -v offen_data:/backup/offen-db:ro -v /var/run/docker.sock:/var/run/docker.sock:ro --env-file backup.env offen/docker-volume-backup:v…

Manually triggering a backup

Instead of running the backups on a regular schedule, you can also execute the command in a running container yourself:

    docker exec <container ref> backup

Restoring a volume from a backup

To recover from a backup, download and untar the backup file, and copy its contents back into the Docker volume using a one-off container created for just that purpose:

    docker run -d --name backup_restore -v offen_data:/backup_restore alpine
    docker cp <location of your unpacked backup> backup_restore:/backup_restore
    docker stop backup_restore && docker rm backup_restore

The volume is now ready to use in other containers. Alternatively, you can use a one-off volume created beforehand.

More information

Detailed documentation and the source code are available at the GitHub repository and at Docker Hub. The source code is licensed under the Mozilla Public License.

Wrapping up

Knowing you have remote backups around in case of unexpected infrastructure glitches helps you move forward with confidence and not too much worry. I hope this article demonstrated that adding them to your Docker setup is only a matter of configuring an additional container, and helps you get going with your backups so you can move forward with your product.

Written by Frederik Ring |
2021-05-03 09:44:58 |
Overseas TECH |
DEV Community |
Solution: Running Sum of 1d Array |
https://dev.to/seanpgallivan/solution-running-sum-of-1d-array-34na
|
Solution: Running Sum of 1d Array

This is part of a series of Leetcode solution explanations. If you liked this solution or found it useful, please like this post and/or upvote my solution post on Leetcode's forums.

Leetcode Problem (Easy): Running Sum of 1d Array

Description: Given an array nums, we define a running sum of an array as runningSum[i] = sum(nums[0]…nums[i]). Return the running sum of nums. (The example inputs/outputs and the numeric constraints on nums.length and nums[i] were lost in this copy of the article.)

Idea: While this is not a terribly challenging problem, it's a good introduction to the concept of a prefix sum array. Prefix sum arrays have many uses in more complex algorithms and can sometimes help reduce the time complexity of an advanced solution by an order of magnitude. In a prefix sum array, we will create a duplicate array which contains the running sum of the elements 0…i of our original array nums for each index i of our prefix sum array ans. (Note: we can lower the space complexity by using an in-place approach with nums directly, mutating it into its own prefix sum array, if there is no compelling reason to avoid modifying a function argument.) Since we'll need to build on a previous running total, we should start our iteration at i = 1 and copy over the first element from nums to ans. Then we just iterate through nums and add each element nums[i] to the previous running total ans[i-1] to create the new running total ans[i]. When we're done, we can return ans.

Time complexity: O(N), where N is the length of nums.
Space complexity: O(N) for our running sum array, or O(1) with an in-place approach.

JavaScript code:

    var runningSum = function(nums) {
        let ans = new Array(nums.length)
        ans[0] = nums[0]
        for (let i = 1; i < nums.length; i++)
            ans[i] = ans[i-1] + nums[i]
        return ans
    }

Python code:

    class Solution:
        def runningSum(self, nums: List[int]) -> List[int]:
            ans = [0] * len(nums)
            ans[0] = nums[0]
            for i in range(1, len(nums)):
                ans[i] = ans[i-1] + nums[i]
            return ans

Java code:

    class Solution {
        public int[] runningSum(int[] nums) {
            int[] ans = new int[nums.length];
            ans[0] = nums[0];
            for (int i = 1; i < nums.length; i++)
                ans[i] = ans[i-1] + nums[i];
            return ans;
        }
    }

C++ code:

    class Solution {
    public:
        vector<int> runningSum(vector<int>& nums) {
            vector<int> ans(nums.size());
            ans[0] = nums[0];
            for (int i = 1; i < nums.size(); i++)
                ans[i] = ans[i-1] + nums[i];
            return ans;
        }
    };
|
2021-05-03 09:32:36 |
Overseas TECH |
DEV Community |
Integrating webhook notifications with Adyen Checkout |
https://dev.to/adyen/integrating-webhook-notifications-with-adyen-checkout-39dm
|
Integrating webhook notifications with Adyen Checkout

By Andrew Wong, Developer Advocate

Intro

When it comes to processing payments, Adyen provides a webhook to help customers (our platform's merchants) build a complete online checkout integration. In this blog we'll check out how webhooks work, and how you can implement and test webhook notifications in your checkout integration.

What is a webhook?

In short, using a webhook is a great way for your applications to receive events or notifications from a service. To illustrate, think about your favorite local bookstore. Before you ever make a trip out to them, you'd prefer to know whether or not a certain book is in stock. One way to find out is to call the bookstore yourself and ask them directly. While this certainly works, one drawback is that you may have to make repeated calls to find out the availability of your favorite book. This wouldn't be a great use of your, nor the store's, time and resources. What if the book tends to sell out quickly? Having information about the book's availability the moment it's restocked could greatly increase your chances of coming home with a new novel.

In the context of software development, calling the bookstore is like making an API call and receiving a response. In both situations it's still a manual process, and you receive information only when you ask for it. However, what if the bookstore calls you when they finally have the book in stock? This would save time for both parties, as information about any in-stock status changes would only be sent whenever there's new information to share in the first place. This is the philosophy behind a webhook: a way to receive new information or data proactively, without having to repeatedly poll a service. Now, how does this look in real-world applications?

Real-world application of webhooks

Imagine the "read receipts" in your favorite messaging app. For the most part, it's made possible by taking advantage of an event webhook. That is, whenever your sent message is opened by the other application, an HTTP POST request is made to a specified URL. In turn, your application is notified with data that the message has been read, and can then run business logic for rendering the read receipt. There's no need for your application to continuously check if your message has been read, as it'll be automatically notified when the read status has been updated. All in all, using a webhook is ideal whenever you need to run some business logic in your application based on an event happening in another application.

Adyen notification webhooks

Webhooks are essential in the financial services space due to the asynchronous information flow of banks, payment methods, and other financial systems. Adyen provides a webhook to communicate this crucial information to our customers as soon as it is available. This information includes:

- Payment status updates.
- Events that are not triggered by a request from the customer side. This includes events such as a shopper-initiated chargeback.
- Requests that are processed asynchronously. For example, in the case of many local payment methods such as iDEAL, the outcome of the payment may take several hours to confirm. We send the customer a notification the second new information becomes available.

Integrating the webhook

There are two main steps to configure Adyen's notification webhooks:

1. Creating an endpoint on your server. This includes acknowledging the notification sent by Adyen, verifying its signature, storing the notification, and applying your business logic.
2. Setting up notifications in Customer Area.

Let's check out how to set up notifications in a test online Checkout application built in React and Express. We'll also be using ngrok, which helps provide a public URL to tunnel into localhost as we test our integration. Before writing our first line of code, be sure to clone and install the test application. You can find instructions for getting everything up and running, including generating the required account keys, in the test application's README.

Creating an endpoint on your server

Once your project is up and running (i.e. accessible on its port), the first step is to expose and configure an endpoint on the server. In the Express server file, we'll create an HTTP POST route to /api/webhook/notification:

    app.post("/api/webhook/notification", async (req, res) => {
      res.send("[accepted]")
      try {
        // ...
      } catch (err) {
        // ...
      }
    })

As notifications come in, we'll first acknowledge the notifications with an "[accepted]" response. This helps ensure that our application server continues to properly accept notifications.

Next, we'll need to retrieve the array of notification request items from the POST body, which looks something like this (values shown as "..." were lost in this copy):

    {
      "NotificationRequestItem": {
        "additionalData": { ... },
        "eventCode": "AUTHORISATION",
        "success": "true",
        "eventDate": "...",
        "merchantAccountCode": "YOUR_MERCHANT_ACCOUNT",
        "pspReference": "...",
        "merchantReference": "YOUR_REFERENCE",
        "amount": {
          "value": ...,
          "currency": "EUR"
        }
      }
    }

Note that some fields included in the NotificationRequestItem object depend on the type of event that triggered the notification. For example, notifications triggered by a request to refund will be different than a notification triggered by a chargeback request.

Next, we need to iterate over the list of notification request items and process them based on the type of event the notification item is associated with (i.e. its eventCode). As we iterate through the list, however, we'll also want to verify their HMAC (hash-based message authentication code) signatures. This helps protect the application server from unauthorised notifications, such as any data that may have been modified during transmission. Since our application leverages the Adyen Node.js server-side library to interact with Adyen's API, we can conveniently import and instantiate the built-in HMAC validator class to do just that:

    const { hmacValidator } = require('@adyen/api-library')
    const validator = new hmacValidator()

Next, we'll begin processing each notification request item:

    const notificationRequestItems = req.body.notificationItems
    notificationRequestItems.forEach(item => {
      if (validator.validateHMAC(item.NotificationRequestItem, process.env.ADYEN_HMAC_KEY)) {
        const eventCode = item.NotificationRequestItem.eventCode
        // Your business logic here, i.e. process the notification based on the eventCode
      } else {
        // Non-valid NotificationRequest
        console.log("Non-valid NotificationRequest")
      }
    })

As we iterate through the list of notification request items, we use our instance of the HMAC validator to validate each notification request item, passing in the object itself as well as our HMAC key (referenced above in an environment variable). This secret key enables HMAC-signed notifications, and we'll generate it shortly in the next steps of this blog. If the notification request item is valid, we run some business logic based on the event code. For example, the notification may detail a successful payment, or inform you that a new report (for accounting purposes) is available. Your application can then take any action it needs to based on this new information.

All together, our endpoint should look something like the following:

    app.post("/api/webhook/notifications", async (req, res) => {
      res.send("[accepted]")
      try {
        const validator = new hmacValidator()
        const notificationRequestItems = req.body.notificationItems
        // Be sure to store your notifications in a database (not shown here)
        notificationRequestItems.forEach(item => {
          if (validator.validateHMAC(item.NotificationRequestItem, process.env.ADYEN_HMAC_KEY)) {
            const eventCode = item.NotificationRequestItem.eventCode
            // Your business logic here, i.e. process the notification based on the eventCode
            // An example for a payment notification:
            if (eventCode === "AUTHORISATION") {
              // This notification is for a payment
              if (item.NotificationRequestItem.success === "true") {
                // Payment was successful
              } else {
                // Payment was refused
              }
            }
          } else {
            // Non-valid NotificationRequest
            console.log("Non-valid NotificationRequest")
          }
        })
      } catch (err) {
        console.error(`Error: ${err.message}, error code: ${err.errorCode}`)
        res.status(err.statusCode).json(err.message)
      }
    })

At this point, our server endpoint has been exposed and configured. The issue, however, is that our application is currently being served on localhost. To set up the webhook, we'll need a public URL to the application. This is where we leverage the ngrok tool, which helps us tunnel (i.e. forward) to our local application from a publicly accessible URL. Follow the instructions to install ngrok and connect your account, then run ngrok from the command line on the application's port:

    ngrok http <port>

If everything's running correctly, you should see terminal output reporting a Session Status of "online", together with "Forwarding" lines that map public http/https URLs to localhost. Note the public URL, as referenced by "Forwarding" in the output. We'll use this value as we head into the Adyen Customer Area for webhook configuration.

Setting up notifications in Customer Area

The next step is to provide your server's details, as well as customize the information you want to receive in notifications. To get started, first log in to Customer Area. On the top right, navigate to Account > Server communication. Then, next to Standard Notification, click Add. Under Transport, supply the following information:

- URL: your server's public URL. This is the "Forwarding" value in the ngrok terminal output above.
- SSL: for a test environment, use NO SSL / HTTP (TEST only).
- Method: JSON
- Active: ✓

Finally, under Additional Settings, select the information you want to receive in notifications. For example, adding the acquirer result allows you to receive standardized mapping of the acquirer results for AVS and CVC checks. For more information on shopper and transaction information, check out Additional settings in our documentation.

The final step is to select Generate new HMAC key. This allows you to receive HMAC-signed notifications, which will verify the integrity of notifications using these HMAC signatures. After generating the HMAC key, assign its value to an environment variable (i.e. process.env.ADYEN_HMAC_KEY, as referenced in the above server code). At this point, the server and server communication settings have been fully set. Before we jump into testing, be sure to select Save Configuration at the bottom of the page.

Testing the webhook

To make sure that we can receive notifications, we'll first need to test them. While still on the current server settings page, scroll to the bottom and toggle which notifications you want to test. Then select Test Configuration. If everything was set up correctly, you should see a success message at the top of the page. At this point, it's a wrap! Feel free to further configure your server communication and continue to apply any business logic in your server code as you see fit.

Getting started

Implementing Adyen notification webhooks is crucial for a successful integration with Adyen. To get started, feel free to check out the quick start in documentation. To see everything in action, first make sure you have the proper keys and credentials from a test account, then clone one of our example integrations. As always, you are welcome to visit our forum or reach out directly if you have any comments or feedback along the way.

Technical careers at Adyen: we are on the lookout for talented engineers and technical people to help us build the infrastructure of global commerce. Check out developer vacancies.

Developer newsletter: get updated on new blog posts and other developer news. Subscribe now.

Originally published at … on March … |
2021-05-03 09:19:28 |
Overseas TECH |
DEV Community |
SvelteKit Sitemap |
https://dev.to/rbt/sveltekit-sitemap-b00
|
SvelteKit Sitemap

SvelteKit came out in public beta a little over a month ago, and I've finally gotten around to trying it out. I'll write up my thoughts elsewhere, but I've moved r-bt.com over to SvelteKit and replaced my Notion CMS with markdown, the reason being that I want to be able to use custom components. Anyway, one problem I had was creating a sitemap.xml for my static build. SvelteKit doesn't support creating sitemaps automatically, although it might in the future. Instead, I made a post-build step. Some notes about this:

- I'm using a recent Node version (the exact version number was lost in this copy); if you use an earlier version you might need to change import to require.
- I use @sveltejs/adapter-static to build a static site, which is stored in build.

The Script

Install the dependencies:

    npm install -D fast-glob xmlbuilder2

Create a new file, generate-sitemap.js, and add the following:

    import fs from 'fs'
    import fg from 'fast-glob'
    import { create } from 'xmlbuilder2'
    import pkg from './package.json'

    const getUrl = (url) => {
      const trimmed = url.slice('build/'.length).replace('index.html', '')
      return `${pkg.url}/${trimmed}`
    }

    async function createSitemap() {
      const sitemap = create({ version: '1.0' }).ele('urlset', {
        xmlns: 'http://www.sitemaps.org/schemas/sitemap/0.9',
      })
      const pages = await fg(['build/**/*.html'])
      pages.forEach((page) => {
        const url = sitemap.ele('url')
        url.ele('loc').txt(getUrl(page))
        url.ele('changefreq').txt('weekly')
      })
      const xml = sitemap.end({ prettyPrint: true })
      fs.writeFileSync('build/sitemap.xml', xml)
    }

    createSitemap()

Update your package.json (the url value was lost in this copy):

    "url": "…",
    "scripts": {
      "postbuild": "node --experimental-json-modules generate-sitemap.js"
    }

The Explanation

To make the sitemap, we're going to build the site, glob all the html files, and write the XML back to the build directory. Before starting, install the dependencies:

    npm install -D fast-glob xmlbuilder2

Now create a new file, generate-sitemap.js. First let's get the files we need:

    import fg from 'fast-glob'

    async function createSitemap() {
      const pages = await fg(['build/**/*.html'])
      console.log(pages)
    }

If you run this, you should get an array with the paths of all your pages:

    pages: [ 'build/index.html', 'build/blog/index.html', 'build/about/index.html', 'build/learning/index.html' ]

Next we'll use xmlbuilder2 to create the XML objects:

    import { create } from 'xmlbuilder2'

    const sitemap = create({ version: '1.0' }).ele('urlset', {
      xmlns: 'http://www.sitemaps.org/schemas/sitemap/0.9',
    })

and we just loop through the pages, adding each as a url object with a loc and changefreq to the sitemap:

    pages.forEach((page) => {
      const url = sitemap.ele('url')
      url.ele('loc').txt(page)
      url.ele('changefreq').txt('weekly')
    })

Finally, we turn the sitemap into a string and write it to a file using fs.writeFileSync:

    const xml = sitemap.end({ prettyPrint: true })
    fs.writeFileSync('build/sitemap.xml', xml)

Except we have a problem. If you run this code:

    node generate-sitemap.js

and go to build/sitemap.xml, you'll see that the locs are something that looks like:

    build/learning/why-is-it-so-hard-to-find-a-domain/index.html

while we want it to be the full public URL. To fix this, go to your package.json and add a url field. Then, in generate-sitemap.js, we'll import package.json and prepend the url to the pages' paths. We'll also remove the leading "build/" and the trailing "index.html":

    import pkg from './package.json'

    const getUrl = (url) => {
      const trimmed = url.slice('build/'.length).replace('index.html', '')
      return `${pkg.url}/${trimmed}`
    }

Node.js doesn't yet support importing JSON files by default, so we need to run this script with the --experimental-json-modules flag:

    node --experimental-json-modules generate-sitemap.js

and your sitemap should be generated and valid. To get it to run whenever you build the site, go back to package.json and in scripts add:

    "scripts": {
      "postbuild": "node --experimental-json-modules generate-sitemap.js"
    }
|
2021-05-03 09:13:09 |
Overseas TECH |
DEV Community |
Use depfu and Mergify to automatically merge dependency updates |
https://dev.to/boyum/use-depfu-and-mergify-to-automatically-merge-dependency-updates-1aid
|
Use depfu and Mergify to automatically merge dependency updates

Over the years I have accumulated quite a few free-time projects that, one after another, become stale. Security alerts keep rolling in, and getting all projects up to date is exhausting and might feel overwhelming. Let's automate this task.

depfu

For some time I updated the projects manually; however, this became way too time-consuming. Enter depfu, a service (free for open source projects) that keeps your project's dependencies up to date by proposing pull requests (PRs) whenever there's a new dependency version. (Renovate is a similar service and would work the same for the purpose of this tutorial.) Depfu has made my life much easier: it automatically creates PRs, and the only job left for me is to approve and merge the PR. This is all well and good; however, with many projects even this process becomes tedious. Let's try to automate this task even further.

Mergify

Mergify can merge PRs automatically and lets us define rules for when that should happen. Together, depfu and Mergify can automatically keep our dependencies updated.

Actual tutorial

Step 1: Create depfu and Mergify accounts

Before we can start configuring these tools, we'll need to create one account in each service and give the services the required permissions. Once this is done, depfu will start creating dependency update PRs in the projects that were added in the depfu GUI.

Step 2: Configure Mergify

We can configure Mergify in a .mergify.yml file placed in the root of our project. Mergify has a great deal of example configurations, which is very helpful when new to the tool. This configuration is very powerful; however, our task is quite simple and doesn't need much writing:

    pull_request_rules:
      - name: Automatic merge for depfu pull requests
        conditions:
          - author=depfu[bot]
          - base=main # or master
        actions:
          merge:
            method: merge

That's actually all that we need. We ensure that it was actually depfu that created the PR, then check that the PR will be merged to the main branch. Now every pull request created by depfu will be merged automatically.

Is automating this a good idea?

We should ask ourselves if we actually want dependency updates to be merged automatically. They should be subject to review, and should perhaps not be merged into the codebase uncritically. This can be mitigated by adding automated tests and running build scripts on every commit. If required checks fail, Mergify won't merge the PR. Also, Mergify has another trick up its sleeve: we can do a RegEx search on the PR's title. This combines neatly with the fact that depfu adds a major, minor, or patch label to the end of the PR title. We can filter out major and minor updates, and our final Mergify config now looks like this:

    pull_request_rules:
      - name: Automatic merge for depfu pull requests
        conditions:
          - author=depfu[bot]
          - base=main # or master
          - title~=patch
        actions:
          merge:
            method: merge

PRs in my repository must be reviewed before merge

Oh, do they now? As said before, Mergify will wait until no required checks fail, and that includes required reviews. No problem: we can automate PR review as well. I've found that Andrew Musgrave's "automatic pull request review" GitHub Action works great. In our .github/workflows directory we create a new GitHub check. Let's call the file automatic-dependency-review.yml (you can call the file whatever you want):

    name: Automatic dependency review
    on: pull_request
    jobs:
      automate-dependency-review:
        runs-on: ubuntu-latest
        steps:
          - name: Approve pull request
            if: github.event.pull_request.user.login == 'depfu[bot]'
            uses: andrewmusgrave/automatic-pull-request-review@…
            with:
              repo-token: ${{ secrets.GITHUB_TOKEN }}
              event: APPROVE
              body: 'Thank you depfu'

The file now reads like this: for every pull request, check that depfu created it, then approve it. Done! Now, with these new apps and actions, dependency update pull requests will be created, reviewed, and merged automatically. The GitHub Marketplace is filled with gems like these, and I encourage you all to explore the list to make life easier and more automated. |
2021-05-03 09:08:56 |
Apple |
AppleInsider - Frontpage News |
Apple to close Virginia's MacArthur Center store on May 14 |
https://appleinsider.com/articles/21/05/03/apple-to-close-virginias-macarthur-center-store-on-may-14?utm_medium=rss
|
Apple to close Virginia's MacArthur Center store on May 14. The Apple Store in MacArthur Center, VA is closing down permanently, and follows the shuttering of many prominent stores in the mall. [Image: Apple MacArthur Center, source: Apple] As previously reported, Apple has been planning to close down its MacArthur Center store in Virginia. The store's own webpage now confirms that the last day will be Friday, May 14. Read more... |
2021-05-03 09:51:38 |
Overseas TECH |
Engadget |
Researchers detail three new Intel and AMD Spectre vulnerabilities |
https://www.engadget.com/three-new-intel-amd-spectre-vulnerabilities-092432930.html
|
Researchers detail three new Intel and AMD Spectre vulnerabilities. Security researchers have discovered three new variants of Spectre vulnerabilities that affect Intel and AMD processors with micro-op caches. |
2021-05-03 09:24:32 |
News |
@Nikkei (digital edition) |
The Constitution as I see it: asking the experts
https://t.co/GITvKZQeGZ |
https://twitter.com/nikkei/statuses/1389147447197257728
|
Constitution |
2021-05-03 09:18:55 |
News |
@Nikkei (digital edition) |
Restaurants across all of Fukuoka Prefecture asked to shorten hours; commercial facilities too
https://t.co/dGYMmstvSI |
https://twitter.com/nikkei/statuses/1389143939110760453
|
Commercial facilities |
2021-05-03 09:04:58 |
Overseas news |
Japan Times latest articles |
The downside of decision-making practices in Japan |
https://www.japantimes.co.jp/community/2021/05/03/general/downside-decision-making-practices-japan/
|
whole |
2021-05-03 20:00:17 |
Overseas news |
Japan Times latest articles |
Moving house in a pandemic: Meeting the neighbors and readying your child for their ‘playground debut’ |
https://www.japantimes.co.jp/community/2021/05/03/voices/moving-house-pandemic-readying-child-playground-debut/
|
Moving house in a pandemic: Meeting the neighbors and readying your child for their 'playground debut'. In the final entry of a three-part series on moving house during the pandemic, our writer masks up to meet the neighbors. |
2021-05-03 19:30:10 |
Overseas news |
Japan Times latest articles |
New frontier: The future of tourism |
https://www.japantimes.co.jp/life/2021/05/03/travel/future-tourism-japan-covid-19/
|
covid |
2021-05-03 18:30:47 |
News |
BBC News - Home |
Line of Duty finale lands record ratings |
https://www.bbc.co.uk/news/entertainment-arts-56945425
|
drama |
2021-05-03 09:06:02 |
LifeHuck |
Lifehacker [Japan edition] |
Early-bird sales ending soon! A 4-way bag from a brand founded by a former Yoshida Kaban designer |
https://www.lifehacker.jp/2021/05/machi-ya-wingman-end.html
|
wingman |
2021-05-03 19:00:00 |
Hokkaido |
Hokkaido Shimbun |
Severe COVID-19 cases in Japan hit 1,084, a record high for the second straight day; spread of variants a factor |
https://www.hokkaido-np.co.jp/article/540141/
|
Ministry of Health, Labour and Welfare |
2021-05-03 18:10:00 |
Hokkaido |
Hokkaido Shimbun |
COVID-19 patients under care double in Osaka and Hyogo; seven more prefectures at risk of strain |
https://www.hokkaido-np.co.jp/article/540140/
|
Novel coronavirus |
2021-05-03 18:07:00 |