IT |
気になる、記になる… |
Apple may announce its most diverse set of new products in company history this fall |
https://taisy0.com/2022/01/24/151091.html
|
apple |
2022-01-23 15:15:44 |
python |
New posts tagged 'Python' - Qiita |
Notes on a script that converts JSON files to CSV |
https://qiita.com/suo-takefumi/items/8a43b39a68365a9c0b9f
|
Notes on a script that converts JSON files to CSV. Introduction: these are notes from writing a conversion script, sample ④, which converts a JSON file to CSV format and writes it to a file (convertFromJsonToCsv.py). Conversion rules for sample ④: the CSV output has two columns, the JSON element and the element's value. For the element column, all nested elements are concatenated with a delimiter, either with or without a sequence-number prefix. For the value column, the value is the same as the JSON element's value; if multiple values match the constructed element, they are concatenated with a delimiter. Refinements such as an option to specify the target JSON file at script run time have not been implemented. |
2022-01-24 00:59:17 |
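The flattening rules described in the entry above (nested keys concatenated with a delimiter, a two-column key/value output) can be sketched in Python. This is a hypothetical reconstruction, since the original convertFromJsonToCsv.py is not shown; the "." delimiter, the function names, and the column headers are assumptions, and the rule about joining multiple matching values is simplified to indexing list items.

```python
import csv
import json

def flatten(obj, prefix=""):
    """Flatten nested JSON into delimiter-joined key paths ("." is an assumption)."""
    rows = {}
    if isinstance(obj, dict):
        for key, value in obj.items():
            path = f"{prefix}.{key}" if prefix else key
            rows.update(flatten(value, path))
    elif isinstance(obj, list):
        # List items get an index segment in the key path.
        for i, value in enumerate(obj):
            rows.update(flatten(value, f"{prefix}.{i}"))
    else:
        # A leaf value becomes the second CSV column.
        rows[prefix] = str(obj)
    return rows

def json_to_csv(json_path, csv_path):
    """Read a JSON file and write its flattened key/value pairs as CSV."""
    with open(json_path, encoding="utf-8") as f:
        data = json.load(f)
    with open(csv_path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["key", "value"])  # two columns, as the entry describes
        for key, value in flatten(data).items():
            writer.writerow([key, value])
```

For example, `flatten({"a": {"b": 1}, "c": [2, 3]})` yields `{"a.b": "1", "c.0": "2", "c.1": "3"}`.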
python |
New posts tagged 'Python' - Qiita |
How to create a custom rqt plugin, with a side of standalone execution |
https://qiita.com/DaiGuard/items/9f6f6a1ada18bb7630a4
|
Directory layout: catkin_ws/src/myrqtplugin, containing src/, CMakeLists.txt, package.xml, manifest.xml (create this here) and plugin.xml (create this here). Creating the plugin script: since plugin.xml declares a plugin myrqtplugin/MyRqtPlugin with its script in the scripts folder, create a file like the following. |
2022-01-24 00:17:25 |
js |
New posts tagged 'JavaScript' - Qiita |
I want to understand the anonymous functions I encountered in JavaScript |
https://qiita.com/choco_0083/items/5536afc9a411004fe267
|
javascript |
2022-01-24 00:51:29 |
js |
New posts tagged 'JavaScript' - Qiita |
[JS] Iterating over arrays |
https://qiita.com/Shi-raCanth/items/e335e804e283ff31acaf
|
Iteration in JS using a for loop: const fruits = ['apple', 'orange', 'banana']; for (let i = 0; i < fruits.length; i++) { console.log(fruits[i]); } Output: apple, orange, banana. Because the loop condition is "run while i is smaller than the number of array elements", the body executes once per element of the array. |
2022-01-24 00:36:26 |
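The point the entry above makes about the loop condition (iterate while the index is smaller than the element count) is language-independent; here is the same pattern rendered in Python for comparison, with the list contents taken from the article. The Python rendering itself is an illustration, not code from the article.

```python
fruits = ["apple", "orange", "banana"]

# Loop while i is smaller than the number of elements,
# so the body runs exactly once per element.
visited = []
i = 0
while i < len(fruits):
    visited.append(fruits[i])
    print(fruits[i])
    i += 1
```

This prints apple, orange, banana, one per line, just like the JavaScript version.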
js |
New posts tagged 'JavaScript' - Qiita |
I tried replacing webpack with vite |
https://qiita.com/rei1011/items/5458d9d0a623e5735be2
|
There are plenty of articles covering what vite is, but few that discuss actually replacing webpack with it, so this article walks through the steps needed to migrate from webpack to vite. |
2022-01-24 00:13:47 |
AWS |
New posts tagged 'AWS' - Qiita |
Notes on a script that converts JSON files to CSV |
https://qiita.com/suo-takefumi/items/8a43b39a68365a9c0b9f
|
Notes on a script that converts JSON files to CSV. Introduction: these are notes from writing a conversion script, sample ④, which converts a JSON file to CSV format and writes it to a file (convertFromJsonToCsv.py). Conversion rules for sample ④: the CSV output has two columns, the JSON element and the element's value. For the element column, all nested elements are concatenated with a delimiter, either with or without a sequence-number prefix. For the value column, the value is the same as the JSON element's value; if multiple values match the constructed element, they are concatenated with a delimiter. Refinements such as an option to specify the target JSON file at script run time have not been implemented. |
2022-01-24 00:59:17 |
AWS |
New posts tagged 'AWS' - Qiita |
Restricting IPs for a specific domain with WAF (Terraform) |
https://qiita.com/hengineer/items/aa4a7371e9a10e113a4b
|
Restricting IPs for a specific domain with WAF (Terraform). Use case: a single server runs both an admin-console application and a publicly exposed API server, and you want IP restrictions on the admin console but not on the API. How? The flow: create an IP allowlist (aws_wafv2_ip_set), create a Web ACL (aws_wafv2_web_acl), and in the aws_wafv2_web_acl resource definition implement the rules "allow specific IPs" and "block access to the specific domain" in the rules block, ordered by priority weight. |
2022-01-24 00:24:07 |
Docker |
New posts tagged 'docker' - Qiita |
How I Dockerized my AtCoder-cli environment |
https://qiita.com/oura-hideyoshi/items/b77ef5db42b7d33044c0
|
How I Dockerized my AtCoder-cli environment. Introduction: as the title says. I am a fledgling competitor with only a few AtCoder contests behind me, but I virtualized my test environment as Docker practice. I still don't understand Docker well and didn't know how to build an image combining the two environments, Node and Python, so I built on the image from the author linked below. |
2022-01-24 00:44:17 |
Git |
New posts tagged 'Git' - Qiita |
Git anti-patterns that beginners often fall into, and countermeasures |
https://qiita.com/Colonel_GTU/items/0dc8300c22e1eb158df2
|
Especially when commits are not temporary and work happens on a branch, have people push frequently rather than holding commits only in their local repository, if only so other collaborators can see the state of the work. |
2022-01-24 00:02:01 |
Ruby |
New posts tagged 'Rails' - Qiita |
A quick summary of CypressOnRails |
https://qiita.com/k0529m/items/b8333eb0cf41f588f2bb
|
Installation: from adding the gem through running tests and opening the dashboard that shows the results, everything proceeds very easily. |
2022-01-24 00:57:45 |
Ruby |
New posts tagged 'Rails' - Qiita |
[JS] Iterating over arrays |
https://qiita.com/Shi-raCanth/items/e335e804e283ff31acaf
|
Iteration in JS using a for loop: const fruits = ['apple', 'orange', 'banana']; for (let i = 0; i < fruits.length; i++) { console.log(fruits[i]); } Output: apple, orange, banana. Because the loop condition is "run while i is smaller than the number of array elements", the body executes once per element of the array. |
2022-01-24 00:36:26 |
Overseas TECH |
MakeUseOf |
How to Share and Protect Your Google Sheets |
https://www.makeuseof.com/how-to-share-protect-google-sheets/
|
google |
2022-01-23 15:45:21 |
Overseas TECH |
MakeUseOf |
What’s New in KDE Plasma 5.24: 5 Major Improvements to Expect |
https://www.makeuseof.com/kde-plasma-5-24-new-features/
|
expect the |
2022-01-23 15:31:42 |
Overseas TECH |
MakeUseOf |
What Is AMD's FSR and What Does it Do? |
https://www.makeuseof.com/amd-fsr-explained/
|
resolution |
2022-01-23 15:01:43 |
Overseas TECH |
DEV Community |
Share Text Across Near 💻Devices📱 using this website 🔥 |
https://dev.to/rajeshj3/share-text-across-near-devices-using-this-website-23hh
|
Share Text Across Near Devices using this website. Sharing text data across nearby devices has always been a headache. The conventional methods are native cross-platform applications (e.g. WhatsApp, WeChat, Telegram) or email services (Gmail, Yahoo Mail, etc.), and all of them need either installation of native applications or bulky sites. Solution: TEMP-SHARE, a fast, reliable and secure web application that meets all your requirements for sharing text across nearby devices: unlimited text length, a secure random alphanumeric passcode or your own custom password, data persistence limited to a set number of minutes, one-time read only, and, most important, dark and light themes. How to use: visit temp-share.ml, enter the text you want to share and hit SUBMIT; you get a random alphanumeric passcode. Open the same website (temp-share.ml) on another device, paste the passcode, and hit GET TEXT; there you go, you have the text on your other device. Note: because of one-time view, fetching the text again returns an error. You can also set your own custom password. TEMP-SHARE is currently live at temp-share.ml. It is not open source yet, but if you want the source opened, drop a comment explaining why you are interested. I hope you liked this quick introduction to TEMP-SHARE; if so, please don't forget to drop a like, and help me reach my subscriber goal on my YouTube channel. Happy coding. |
2022-01-23 15:25:22 |
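The one-time-read flow described above (submit text, receive a passcode, read once from another device) can be sketched with a small in-memory store. Everything here is a hypothetical illustration, not the real temp-share service's implementation: the class name, the passcode length and alphabet, and the error type are all assumptions.

```python
import secrets

class TempShare:
    """In-memory sketch of a one-time-read text share (hypothetical)."""

    def __init__(self):
        self._store = {}

    def submit(self, text):
        # Random passcode; length and alphabet are assumptions.
        code = secrets.token_hex(3)
        self._store[code] = text
        return code

    def get_text(self, code):
        # One-time read: popping removes the entry, so a second fetch
        # fails, matching the article's note that re-fetching errors out.
        if code not in self._store:
            raise KeyError("unknown or already-read passcode")
        return self._store.pop(code)
```

Usage mirrors the article's steps: submit on one device to obtain a passcode, then call get_text with that passcode exactly once on the other.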
Overseas TECH |
DEV Community |
Goodbye Dockerfiles: Build Secure & Optimised Node.js Container Images with Cloud Native Buildpacks |
https://dev.to/pmbanugo/goodbye-dockerfiles-build-secure-optimised-nodejs-container-images-with-cloud-native-buildpacks-489p
|
Goodbye Dockerfiles: Build Secure & Optimised Node.js Container Images with Cloud Native Buildpacks. Docker enables developers to easily package, share and run applications. As a platform it has shaped the way we build and run applications, and containers have become the de facto standard for running them. A container image is a lightweight, standalone, executable package of software that includes everything needed to run an application: code, runtime, system libraries and settings. You need a Dockerfile to create an image: when you tell Docker (or any similar tool) to build an image by executing the docker build command, it reads the instructions in the Dockerfile, executes them, and creates an image as a result. Writing Dockerfiles is one thing; writing Dockerfiles that are optimised for fast builds and produce a secure image is another. If you're not careful you may create images that take a long time to build, and aside from build time they may also not be secure. You can learn how to secure and optimise your container images, of course, but wouldn't you rather invest your time and resources in writing code and delegate the task of creating optimised images to a different tool? That's where Cloud Native Buildpacks can help.

What are Cloud Native Buildpacks? They are pluggable, modular tools that transform application source code into container images; their job is to collect everything your app needs to build and run. They replace Dockerfiles in the app development lifecycle and enable swift rebasing of images and modular control over images through the use of builders, among other benefits.

How do they work? Buildpacks examine your app to determine the dependencies it needs and how to run it, then package it up as a runnable container image. Typically you run your source code through one or more buildpacks, and each buildpack goes through two phases: the detect phase and the build phase. The detect phase runs against your source code to determine whether a buildpack is applicable; if it is, it proceeds to the build stage, and if the project fails detection, the build stage for that specific buildpack is skipped. The build phase runs against your source code to download dependencies, compile the source if needed, and set the appropriate entry point and startup scripts. Let's see how to create an image using the pack CLI.

Building your first image. Go to buildpacks.io/docs/tools/pack and follow the instructions for your OS to install the pack CLI. You're going to create and deploy a Node.js web app that returns a string. Run "mkdir micro-app && cd micro-app && npm init -y && npm i micro" to create the project and install micro, an HTTP library for building microservices. Create a file named index.js and paste in the function: module.exports = () => "Hello Buildpacks!". Update your package.json with the start script "start": "micro". That's all you need for the service. Run "pack build micro --builder paketobuildpacks/builder:base" to create an image using the paketobuildpacks/builder:base builder. A builder is an image that contains all the components necessary to execute a build; there are different builder images from Paketo, Heroku and a few others, and you can even create your own or extend an existing one. If you use Heroku, then your app is already making use of Buildpacks without you being aware of it; you can choose to build your images using Heroku buildpacks so you get the same image whether you deploy to Heroku or other platforms. The image is built, and you can try it out by running it with Docker (or Podman, if that's what you use). Run the docker command to start the app and go to localhost in your browser; you should get the text "Hello Buildpacks!" as a response.

Usage in CI/CD. You can build images with Cloud Native Buildpacks and the pack CLI in your continuous integration pipeline. With GitHub Actions there's a Pack Docker Action you can use; combined with the Docker Login Action, you can build and publish to a registry in your workflow. There's a similar process on GitLab if you use GitLab's Auto DevOps. I won't go into details on using Buildpacks in different CI systems, but you can check these resources: Auto Build using Cloud Native Buildpacks in GitLab; the Pack Docker GitHub Action (it can be combined with Docker Login); and the Tekton Buildpacks task available on Tekton Hub (it doesn't require the pack CLI or Docker). |
2022-01-23 15:21:04 |
Overseas TECH |
DEV Community |
Multi environment AZURE deployments with Terraform and GitHub |
https://dev.to/pwd9000/multi-environment-azure-deployments-with-terraform-and-github-2450
|
Multi environment AZURE deployments with Terraform and GitHub.

Overview. This tutorial uses examples from my GitHub demo project template repository. I have been wanting to write a tutorial demonstrating how to perform large-scale terraform deployments in Azure using a non-monolithic approach. I have seen so many large deployments fall into the same trap of using one big monolithic configuration when deploying at scale. Throwing everything into one unwieldy configuration can be troublesome for many reasons, to name a few: making a small change can unintentionally break something much larger somewhere else in the configuration; build time (terraform plan/apply) is increased, since a tiny change can take a long time to run as the entire state is checked; it can become cumbersome and complex for a team or team members to understand the entire code base; module and provider versioning and dependencies can be fairly confusing to debug in this paradigm and may become restrictive; and it becomes unmanageable, risky and time-consuming to plan and implement any changes. There are also many blogs and tutorials on integrating Terraform with DevOps CI/CD processes using Azure DevOps, so I decided to share how to use Terraform with GitHub instead. In this tutorial we will use GitHub reusable workflows and GitHub environments to build enterprise-scale, multi-environment infrastructure deployments in Azure using a non-monolithic approach: complex terraform deployments are broken into simpler, manageable work streams that can be updated independently, improving build time and reducing duplicate workflow code via reusable GitHub workflows. Things you will get out of this tutorial: learn about GitHub reusable workflows; learn how to integrate terraform deployments with CI/CD using GitHub; learn how to deploy resources in Azure at scale; and learn about multi-stage deployments and approvals using GitHub environments. Hopefully you can utilize these concepts in your own organization to build Azure infrastructure at scale in your own awesome cloud projects.

Pre-requisites. To start things off we will build a few pre-requisites needed to integrate our GitHub project and workflows with Azure before we can start building resources. We are going to perform the following steps. (1) Create Azure resources for the terraform backend (optional): a resource group, storage account and key vault to host the terraform backend state, plus an Azure Active Directory app and service principal that will have access to the terraform backend and the subscription in Azure; we will link this service principal with our GitHub project and workflows later in the tutorial. (2) Create a GitHub repository and set up the relevant secrets and environments we will be using; the project hosts our workflows and terraform configurations. (3) Create terraform modules (modular): a few terraform ROOT modules, separated and modular from each other (non-monolithic). (4) Create GitHub workflows: reusable workflows and multi-stage deployments that run and deploy resources in Azure based on the terraform ROOT modules.

Create Azure resources (terraform backend). To set up the resources that act as our terraform backend, I wrote a PowerShell script using the AZ CLI that builds and configures everything and stores the relevant details and secrets in a key vault; you can find it on my GitHub code page, called AZ-GH-TF-Pre-Reqs. First log into Azure with az login; after selecting the subscription, the script creates all the pre-requirements we need. Step by step, it: sets up variables (a random integer, the subscription id, the resource group name Demo-Terraform-Core-Backend-RG, a storage account name, a key vault name, a service principal app name, and the region uksouth); creates the resource group; creates a key vault with RBAC authorization enabled and grants the signed-in user the Key Vault Secrets Officer role so secrets can be created; creates the terraform backend storage account (Standard LRS, StorageV2, HTTPS only, with a minimum TLS version) and grants the signed-in user the Storage Blob Data Contributor role; creates a tfstate container in the storage account to store terraform state files; creates the terraform service principal with az ad sp create-for-rbac and assigns it the Key Vault Secrets Officer role on the key vault; saves the new service principal's details to the key vault as the secrets ARM_CLIENT_ID, ARM_CLIENT_SECRET, ARM_TENANT_ID and ARM_SUBSCRIPTION_ID; and assigns additional RBAC roles to the service principal, Contributor on the subscription and Storage Blob Data Contributor on the backend storage account. In short: it creates a resource group called Demo-Terraform-Core-Backend-RG containing an Azure key vault and storage account, creates an AAD app and service principal that has access to the key vault, the backend storage account container and the subscription, and saves the AAD app and service principal details inside the key vault.

Create a GitHub repository. For this step I actually created a template repository that contains everything to get started; feel free to create your repository from my template by selecting "Use this template" (optional). After creating the GitHub repository there are a few things we need to set on it before we can start using it: add the secrets created in the key vault step above as repository secrets, and create GitHub environments (or environments matching your own requirements), in my case Development, UserAcceptanceTesting and Production. Note that GitHub environments are available on public repos, but for private repos you will need GitHub Enterprise. Also note that on my Production environment I have set a required reviewer; this basically allows me to set explicit reviewers who have to physically approve deployments to the Production environment. To learn more about approvals, see Environment Protection Rules. NOTE: you can also configure GitHub secrets at the environment scope if you have separate service principals or even separate subscriptions in Azure for each environment (for example, your development resources in subscription A and your production resources in subscription B); see "Creating encrypted secrets for an environment" for details.

Create terraform modules (modular). Now that our repository is configured and ready to go, we can start to create some modular terraform configurations; in other words, separate independent deployment configurations based on ROOT terraform modules. If you look at the demo repository, the root of the repository has numbered paths (folders), e.g. Foundation and Storage. Each path contains a terraform ROOT module, which consists of a collection of items that can be independently configured and deployed. You do not have to use the same naming or numbering as I have chosen; the idea is that each path represents a unique, independent, modular terraform configuration consisting of a collection of resources we want to deploy independently. In this example, the Foundation path contains the ROOT module configuration for an Azure resource group and key vault, and the Storage path contains the ROOT module configuration for one general-purpose v2 and one data lake v2 storage account. NOTE: each ROOT module also contains separate TFVARS files, config-dev.tfvars, config-uat.tfvars and config-prod.tfvars, each representing an environment. This is because each of my environments uses the same configuration (foundation resources.tf) but may have slightly different configuration values or naming; for example, the development resource group will be called Demo-Infra-Dev-Rg, whereas the production resource group will be called Demo-Infra-Prod-Rg.

Create GitHub workflows. Next we create the special folder structure .github/workflows in the root of the repository to contain our GitHub Action workflows. You will notice there are numbered caller workflows, e.g. Foundation.yml and Storage.yml; each caller workflow represents a terraform module and is named the same as the path containing the ROOT terraform module described above. There are also two reusable workflows, az-tf-plan.yml and az-tf-apply.yml.

The first reusable workflow, az-tf-plan.yml ("Build TF Plan"), plans a terraform deployment, creates an artifact and uploads it to the workflow artifacts for consumption. A reusable workflow can only be triggered by another workflow (the caller workflows), via the workflow_call trigger. It takes the following inputs: path (required; the path of the root terraform module), tf_version (optional; the version of Terraform to use, default latest), az_resource_group (required; the Azure resource group hosting the backend storage account), az_storage_acc (required; the Azure storage account hosting the backend state), az_container_name (required; the storage account container hosting the backend terraform state), tf_key (required; the terraform state file name for this plan), gh_environment (optional; the GitHub deployment environment, default null) and tf_vars_file (required; the terraform TFVARS file). We also pass secrets from the caller to the reusable workflow, the details of the service principal we created and linked with the repository secrets configured earlier: arm_client_id, arm_client_secret, arm_subscription_id and arm_tenant_id. When called, this workflow: checks out the code repository and sets the working directory to the path containing the terraform module; installs the requested terraform version; format-checks the module (terraform fmt -check); initializes the module with the backend configuration (storage account, container, resource group and state key); validates the module; creates a terraform plan from the given TFVARS file (terraform plan with -var-file and -out) and fails the job if the plan step fails; compresses the plan artifacts into a zip; and uploads the compressed plan as a workflow artifact with a short retention period.

The second reusable workflow, az-tf-apply.yml ("Apply TF Plan"), downloads a terraform plan artifact built by az-tf-plan.yml and applies it, deploying the planned terraform configuration. Its inputs and secrets are the same as the plan workflow's. When called, it: downloads the terraform plan workflow artifact; decompresses it; installs the requested terraform version; re-initializes the terraform module against the same backend; and applies the terraform configuration based on the plan and the values in the TFVARS file (terraform apply with -var-file and -auto-approve).

Let's look at one of the caller workflows next; these workflows call the reusable workflows. Foundation.yml triggers on workflow_dispatch and on pull requests against master. It first calls the reusable workflow az-tf-plan.yml to create a foundational terraform deployment PLAN based on the repository path Foundation (the ROOT module containing the Azure resource group and key vault); the plan artifacts are validated, compressed and uploaded into the workflow artifacts. The caller workflow then calls the second reusable workflow, az-tf-apply.yml, which downloads and decompresses the PLAN artifact and triggers the deployment based on the plan. It also demonstrates how to use GitHub environments to do multi-staged, environment-based deployments with approvals (optional). The Plan_Dev job calls Pwd9000-ML/Azure-Terraform-Deployments/.github/workflows/az-tf-plan.yml@master with path Foundation, tf_version latest, the backend details (az_resource_group, az_storage_acc, az_container_name), the tf_key for the dev foundation state, tf_vars_file config-dev.tfvars and the ARM secrets. Deploy_Dev needs Plan_Dev and calls az-tf-apply.yml with the same values plus gh_environment Development. Plan_Uat and Deploy_Uat follow the same pattern with the uat state key, config-uat.tfvars and gh_environment UserAcceptanceTesting, with Deploy_Uat needing both Plan_Uat and Deploy_Dev. Plan_Prod then follows the same pattern, calling az-tf-plan.yml for the Foundation path with tf_version latest and the same backend settings
resource group name az storage acc your storage account name az container name your sa container name tf key foundation prod tf vars file config prod tfvars secrets arm client id secrets ARM CLIENT ID arm client secret secrets ARM CLIENT SECRET arm subscription id secrets ARM SUBSCRIPTION ID arm tenant id secrets ARM TENANT ID Deploy Prod needs Plan Prod Deploy Uat uses Pwd ML Azure Terraform Deployments github workflows az tf apply yml master with path Foundation az resource group your resource group name az storage acc your storage account name az container name your sa container name tf key foundation prod gh environment Production tf vars file config prod tfvars secrets arm client id secrets ARM CLIENT ID arm client secret secrets ARM CLIENT SECRET arm subscription id secrets ARM SUBSCRIPTION ID arm tenant id secrets ARM TENANT ID Notice that we have multiple jobs in the caller workflow one job to generate a terraform plan and one job to deploy the plan per environment You will see that each plan job uses the different TFVARS files config dev tfvars config uat tfvars and config prod tfvars respectively of each environment but using the same ROOT module configuration in the path Foundation foundation resources tf Each reusable workflows inputs are specified on the caller workflows jobs using with and Secrets using secret You will also note that only the Deploy jobs Deploy Dev Deploy Uat Deploy Prod are linked with an input gh environment which specifies which GitHub environment the job is linked to Each Plan jobs Plan Dev Plan Uat Plan Prod are not linked to any GitHub Environment Each Deploy jobs Deploy Dev Deploy Uat Deploy Prod are also linked with the relevant needs setting of it s corresponding plan This means that the plan job must be successful before the deploy job can initialize and run Deploy jobs are also linked with earlier deploy jobs using needs so that Dev gets built first and if successful be followed by Uat and if successful followed by Prod 
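The callee side of these reusable workflows is not shown in this post. As a rough sketch, a workflow becomes callable by declaring a workflow_call trigger whose inputs and secrets mirror what the caller passes under "with:" and "secrets:". The input names below follow the caller's usage, but the trigger body and steps are an illustration under those assumptions, not the actual az_tf_plan.yml file from the demo repository:

```yaml
# Minimal sketch of a callable workflow (e.g. az_tf_plan.yml).
# Input names mirror the caller above; backend inputs are omitted for
# brevity and the job body is illustrative only.
name: "Reusable TF plan"

on:
  workflow_call:
    inputs:
      path:
        required: true
        type: string
      tf_version:
        required: false
        type: string
        default: latest
      tf_key:
        required: true
        type: string
      tf_vars_file:
        required: true
        type: string
    secrets:
      arm_client_id:
        required: true
      arm_client_secret:
        required: true

jobs:
  plan:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: hashicorp/setup-terraform@v1
        with:
          terraform_version: ${{ inputs.tf_version }}
      - name: Terraform plan
        working-directory: ${{ inputs.path }}
        run: terraform plan -var-file=${{ inputs.tf_vars_file }} -out=plan.tfplan
```

The caller's `uses: <owner>/<repo>/.github/workflows/<file>@<ref>` line resolves against this trigger, and any required input missing from the caller's "with:" block fails the run at validation time.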
However, if you remember, we configured a GitHub protection rule on our Production environment, which needs to be approved before it can run.

NOTE: If you have been following this tutorial step by step and used a cloned copy of the demo repository, you will need to update the caller workflows (.github/workflows/Foundation.yml and .github/workflows/Storage.yml) with the inputs specified under "with:", using the values of your environment.

Testing

Let's run the workflow Foundation and see what happens. After the run, you will see that each plan was created, and the DEV as well as UAT terraform configurations have been deployed to Azure as per the terraform configuration under path Foundation. After approving Production, we can see that the approval has triggered the production deployment, and now we also have a production resource group. You will notice that each resource group contains a key vault, as per our foundation terraform configuration under path Foundation.

Let's run the workflow Storage and, after deploying DEV and UAT, also approve PRODUCTION to run. Now you will notice that each of our environments' resource groups also contains storage accounts, as per the terraform configuration under path Storage.

Lastly, if we navigate to the terraform backend storage account, you will see that, based on the tf_key inputs we gave each of our caller workflow jobs, each terraform deployment has its own state file per ROOT module collection, per environment, which nicely segregates the terraform configuration state files from each other.

I hope you have enjoyed this post and have learned something new. You can find the code samples used in this blog post on my GitHub page. You can also look at the demo project, or even create your own projects and workflows from the demo project template repository.

Author: Marcel L, Cloud Solutions & DevOps Architect. Like, share and follow me on GitHub, Twitter and LinkedIn. |
2022-01-23 15:14:24 |
Overseas TECH |
DEV Community |
build a snake game using canvas and requestAnimationFrame |
https://dev.to/p4nghu/build-a-snake-game-using-canvas-and-requestanimationframe-3iff
|
build a snake game using canvas and requestAnimationFrame

This project is inspired by Dan's streaming, but implemented in my own way. I'm not good at English, so I'll just show my code and comments. If you find this helpful, please give me a star. (live demo / github / blog)

data structure and variables

    const canvas = document.getElementById('canvas')
    const ctx = canvas.getContext('2d')
    // numeric literals below were lost in extraction; the values are assumptions
    const width = 400
    const height = 400
    const cellLength = 20
    let foodPosition
    let initSnake = [[0, 0], [1, 0], [2, 0]] // initial body cells (assumed)
    let snake = [...initSnake]
    let direction = 'right'
    let canChangeDirection = true

canvas background

    function drawBackground() {
      ctx.strokeStyle = '#bfbfbf'
      for (let i = 0; i <= height / cellLength; i++) {
        ctx.beginPath()
        ctx.moveTo(0, cellLength * i)
        ctx.lineTo(width, cellLength * i)
        ctx.stroke()
      }
      for (let i = 0; i <= width / cellLength; i++) {
        ctx.beginPath()
        ctx.moveTo(cellLength * i, 0)
        ctx.lineTo(cellLength * i, height)
        ctx.stroke()
      }
    }

snake

    function drawSnake() {
      let step = 360 / snake.length // hue step (original factor lost)
      for (let i = 0; i < snake.length; i++) {
        // gradient color
        const percent = Math.min(step * i, 360)
        ctx.fillStyle = `hsl(${percent}, 100%, 50%)` // saturation/lightness assumed
        ctx.fillRect(snake[i][0] * cellLength, snake[i][1] * cellLength, cellLength, cellLength)
      }
    }

draw food

    // random food position
    function generateRandomFood() {
      // if there is no place left to generate food, the player has won
      if (snake.length >= (width / cellLength) * (height / cellLength)) return alert('you win')
      const randomX = Math.floor(Math.random() * (width / cellLength))
      const randomY = Math.floor(Math.random() * (height / cellLength))
      // if the position conflicts with the snake, re-generate
      for (let i = 0; i < snake.length; i++) {
        if (snake[i][0] === randomX && snake[i][1] === randomY) return generateRandomFood()
      }
      foodPosition = [randomX, randomY]
    }

    // draw
    function drawFood() {
      ctx.fillStyle = '#ff0000' // color digits lost; assumed red
      ctx.fillRect(foodPosition[0] * cellLength, foodPosition[1] * cellLength, cellLength, cellLength)
    }

snake movement

    function snakeMove() {
      let next
      let last = snake[snake.length - 1]
      // set new snake head by direction
      switch (direction) {
        case 'up':    next = [last[0], last[1] - 1]; break
        case 'down':  next = [last[0], last[1] + 1]; break
        case 'left':  next = [last[0] - 1, last[1]]; break
        case 'right': next = [last[0] + 1, last[1]]; break
      }
      // boundary collision
      const boundary = next[0] < 0 || next[0] > width / cellLength - 1 ||
                       next[1] < 0 || next[1] > height / cellLength - 1
      // self collision
      const selfCollision = snake.some(([x, y]) => next[0] === x && next[1] === y)
      // if collision, restart (restart() is defined elsewhere in the project)
      if (boundary || selfCollision) return restart()
      snake.push(next)
      // if the next cell is food: keep the new head and do not shift the tail
      if (next[0] === foodPosition[0] && next[1] === foodPosition[1]) {
        generateRandomFood()
        return
      }
      snake.shift()
      canChangeDirection = true
    }

event listener

    document.addEventListener('keydown', (e) => {
      switch (e.key) {
        case 'ArrowUp':
          if (direction === 'down' || !canChangeDirection) return
          direction = 'up'
          canChangeDirection = false
          break
        case 'ArrowDown':
          if (direction === 'up' || !canChangeDirection) return
          direction = 'down'
          canChangeDirection = false
          break
        case 'ArrowLeft':
          if (direction === 'right' || !canChangeDirection) return
          direction = 'left'
          canChangeDirection = false
          break
        case 'ArrowRight':
          if (direction === 'left' || !canChangeDirection) return
          direction = 'right'
          canChangeDirection = false
          break
      }
    })

requestAnimationFrame

    // for animation; requestAnimationFrame is too fast for this game by
    // default, so slow it down
    function animate() {
      let count = 0
      function loop() {
        count++
        if (count > 10) { // throttle threshold (original value lost)
          draw() // draw() redraws background, snake and food (defined elsewhere)
          count = 0
        }
        requestAnimationFrame(loop)
      }
      requestAnimationFrame(loop)
    }

fix bug

Because requestAnimationFrame is async, when the snake's direction was 'right' I could change it to 'up' and then 'left' before the snake actually moved, making it reverse into itself. So I added canChangeDirection: the direction can only change again after the snake has moved.

    // event callback
    case 'ArrowUp':
      if (direction === 'down' || !canChangeDirection) return
      direction = 'up'
      canChangeDirection = false
      break
 |
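The collision rules above are pure grid arithmetic, so they can be sketched and tested independently of the canvas. Below is a minimal standalone sketch using the same [x, y] cell coordinates; the helper names (`hitsBoundary`, `hitsSelf`, `nextHead`) are mine for illustration, not identifiers from the original post:

```javascript
// Standalone sketch of the snake's collision rules on a grid of
// `cols` x `rows` cells. Coordinates are [x, y] cell indices, as in the
// post; helper names here are illustrative, not from the original.

// True when the head would leave the board.
function hitsBoundary([x, y], cols, rows) {
  return x < 0 || x >= cols || y < 0 || y >= rows;
}

// True when the head would land on one of the snake's own body cells.
function hitsSelf(head, snake) {
  return snake.some(([x, y]) => head[0] === x && head[1] === y);
}

// Next head cell for a given direction.
function nextHead([x, y], direction) {
  switch (direction) {
    case 'up':    return [x, y - 1];
    case 'down':  return [x, y + 1];
    case 'left':  return [x - 1, y];
    case 'right': return [x + 1, y];
  }
}

const snake = [[0, 0], [1, 0], [2, 0]]; // tail ... head
const head = snake[snake.length - 1];

console.log(nextHead(head, 'right')); // [ 3, 0 ]
console.log(hitsBoundary(nextHead(head, 'up'), 20, 20)); // true (y becomes -1)
console.log(hitsSelf(nextHead(head, 'left'), snake)); // true (reverses into its own body)
```

Keeping these checks as pure functions of the snake array is what makes the game loop in the post so small: snakeMove only has to compute the next head, test the two predicates, and push/shift.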
2022-01-23 15:13:53 |
Apple |
AppleInsider - Frontpage News |
Apple Car engineering manager departs for Meta role |
https://appleinsider.com/articles/22/01/23/apple-car-engineering-manager-departs-for-meta-role?utm_medium=rss
|
Apple Car engineering manager departs for Meta role

The Apple Car has encountered another setback with the loss of another engineering manager from the vehicle project to Meta. The Apple Car has suffered from a consistent turnover of employees joining and leaving the long-rumored project, and it seems the pattern is set to continue. The head of software engineering for the Apple Car has reportedly exited Apple in favor of a role at a competitor. Joe Bass had been the Lead Engineering Program Manager for Autonomous Systems at Apple since January. As spotted by Mark Gurman in Bloomberg's Power On newsletter, he changed his LinkedIn profile in January, showing he left Apple for a new position elsewhere. Read more |
2022-01-23 15:21:48 |
Overseas TECH |
Engadget |
'Dark Souls 3' security hole lets attackers hijack your PC |
https://www.engadget.com/dark-souls-3-security-exploit-hack-152505550.html?src=rss
|
'Dark Souls 3' security hole lets attackers hijack your PC

You might not want to play a Dark Souls game online for a while, not that you necessarily can. As Dexerto and The Verge report, attackers have discovered a security exploit in Dark Souls 3 (and potentially Elden Ring) for Windows that lets attackers remotely execute code and effectively hijack your PC. Streamers like The Grim Sleeper have learned about the potential damage first-hand: in his case, the intruder launched Microsoft PowerShell and ran a text-to-speech script blasting him for his gameplay.

The exploiter might not have malicious intent. A post on the SpeedSouls Discord claimed the hacker was trying to warn developer FromSoftware about the Dark Souls vulnerability, but turned to compromising streamers to highlight the problem. Few people beyond the perpetrator are aware of how to use it, but there's already a patch for the unofficial Blue Sentinel anti-cheat tool.

FromSoftware and its publisher Bandai Namco have since responded to the exploit. They've temporarily shut down the player-versus-player servers for Dark Souls 3 and its predecessors while the security team investigates the flaws. It's not certain when the servers will go back online, but From and Bandai clearly won't restore service until they're reasonably confident players are safe. More sinister attackers could use the flaw to steal sensitive information, ruin gamers' systems and otherwise do serious damage.

"PvP servers for Dark Souls 3, Dark Souls 2, and Dark Souls: Remastered have been temporarily deactivated to allow the team to investigate recent reports of an issue with online services. Servers for Dark Souls: PtDE will join them shortly. We apologize for this inconvenience." - Dark Souls (@DarkSoulsGame), January |
2022-01-23 15:25:05 |
Finance |
◇◇ Insurance Daily News ◇◇ (a must-read for non-life insurance staff!) |
Insurance Daily News (01/24) |
http://www.yanaharu.com/ins/?p=4815
|
Mitsui Sumitomo Insurance |
2022-01-23 15:28:35 |
News |
BBC News - Home |
Nusrat Ghani: Muslimness a reason for my sacking, says ex-minister |
https://www.bbc.co.uk/news/uk-politics-60100525?at_medium=RSS&at_campaign=KARANGA
|
ghani |
2022-01-23 15:42:12 |
News |
BBC News - Home |
Stretford stabbing: Five murder arrests after boy, 16, dies |
https://www.bbc.co.uk/news/uk-england-manchester-60101908?at_medium=RSS&at_campaign=KARANGA
|
attack |
2022-01-23 15:34:10 |
News |
BBC News - Home |
Inflation: Four things that are going up in price and why |
https://www.bbc.co.uk/news/business-60082564?at_medium=RSS&at_campaign=KARANGA
|
products |
2022-01-23 15:01:28 |
News |
BBC News - Home |
Why are old post boxes suddenly going missing? |
https://www.bbc.co.uk/news/uk-england-suffolk-60075415?at_medium=RSS&at_campaign=KARANGA
|
collectors |
2022-01-23 15:15:25 |
Overseas TECH |
reddit |
So, how'd you go? What level? Pass or fail? What's your plan now? |
https://www.reddit.com/r/jlpt/comments/savlxj/so_howd_you_go_what_level_pass_or_fail_whats_your/
|
So, how'd you go? What level? Pass or fail? What's your plan now?

Results for Japan test takers just dropped. I'm so happy to have passed N. The first time I did N I failed; this time I got more questions correct and passed. The scoring system is super weird. My listening score was pretty cool even though I got four questions wrong. Anyway, I have no plan to take N, so now my new goal is to complete WaniKani.

submitted by /u/AsahiWeekly to r/jlpt [link] [comments] |
2022-01-23 15:06:54 |