投稿時間:2022-04-13 19:43:22 RSSフィード2022-04-13 19:00 分まとめ(56件)

カテゴリー等 サイト名等 記事タイトル・トレンドワード等 リンクURL 頻出ワード・要約等/検索ボリューム 登録日
IT 気になる、記になる… iTunes Storeの「今週の映画」、今週は「ストレイ・ドッグ」(レンタル102円) https://taisy0.com/2022/04/13/155767.html apple 2022-04-13 09:43:29
IT ITmedia 総合記事一覧 [ITmedia News] TP-Link製Bluetooth機器でMACアドレスが重複 「他社も同様」と説明 BUFFALO、IODATAは否定 https://www.itmedia.co.jp/news/articles/2204/13/news166.html bluetooth 2022-04-13 18:30:00
IT ITmedia 総合記事一覧 [ITmedia ビジネスオンライン] 7月1日に「星野リゾート BEB5沖縄瀬良垣」がオープン 全室キッチン完備の客室で若者を狙う https://www.itmedia.co.jp/business/articles/2204/13/news164.html itmedia 2022-04-13 18:17:00
IT 情報システムリーダーのためのIT情報専門サイト IT Leaders フォースネット、テレワーク環境から会社のIPアドレスを使って社外に接続可能なVPNサービス | IT Leaders https://it.impress.co.jp/articles/-/23010 フォースネット、テレワーク環境から会社のIPアドレスを使って社外に接続可能なVPNサービスITLeadersVPNや電話システムなどのテレワーク製品を開発・提供するフォースネットは年月日、VPNサービス「おうちワークBOX」を強化すると発表した。 2022-04-13 18:33:00
AWS AWS Japan Blog Amazon GamesのシームレスなMMO「New World」を支えるユニークなアーキテクチャ https://aws.amazon.com/jp/blogs/news/the-unique-architecture-behind-amazon-games-seamless-mmo-new-world/ AmazonECによる広大なMMOワールドの構築開発者は、AmazonElasticComputeCloudAmazonECの異なるインスタンスを用いて構築された独自のアーキテクチャを使用してNewWorldを構築しました。 2022-04-13 09:04:35
python Pythonタグが付けられた新着投稿 - Qiita 小売業 経営企画の視点で機械学習:決定木 https://qiita.com/watanabe-tsubasa/items/f0db5521271f0225292d 機械学習 2022-04-13 18:47:31
python Pythonタグが付けられた新着投稿 - Qiita 【init】Google ColabでLINEおうむ返しBOTを動かす https://qiita.com/chinchilla0413/items/0e15ab63faff61c81506 channelaccesstoken 2022-04-13 18:42:47
Ruby Rubyタグが付けられた新着投稿 - Qiita 【Docker】駆け出しエンジニアがDocker公式チュートリアルに躓きまくった件 https://qiita.com/YUDAI_K/items/3edcc006d919f0bb0bf2 sunasteri 2022-04-13 18:00:49
AWS AWSタグが付けられた新着投稿 - Qiita AWS EC2 AmazonLinux2にMySQL 5.7系をインストールする https://qiita.com/ttabata/items/2f2a5996dd5612b78774 mysqlproductarch 2022-04-13 18:39:00
Docker dockerタグが付けられた新着投稿 - Qiita 【Docker】駆け出しエンジニアがDocker公式チュートリアルに躓きまくった件 https://qiita.com/YUDAI_K/items/3edcc006d919f0bb0bf2 sunasteri 2022-04-13 18:00:49
Git Gitタグが付けられた新着投稿 - Qiita Git と GitHub のセットアップ https://qiita.com/ryo2020/items/8bcae7d3993981c43068 gitforwindows 2022-04-13 18:25:36
Ruby Railsタグが付けられた新着投稿 - Qiita 【Docker】駆け出しエンジニアがDocker公式チュートリアルに躓きまくった件 https://qiita.com/YUDAI_K/items/3edcc006d919f0bb0bf2 sunasteri 2022-04-13 18:00:49
技術ブログ Developers.IO ECSでRakeタスクを実行する https://dev.classmethod.jp/articles/ecs-rails-rake/ console 2022-04-13 09:40:30
海外TECH MakeUseOf How to Try Microsoft Journal, the Notetaking App Built for Styluses https://www.makeuseof.com/try-microsoft-journal/ stylusesmicrosoft 2022-04-13 09:28:43
海外TECH MakeUseOf 6 Ways to Watch YouTube Without Going to YouTube https://www.makeuseof.com/tag/6-ways-to-watch-youtube-without-going-to-youtube/ youtube 2022-04-13 09:20:11
海外TECH MakeUseOf The Best DJ Equipment for Music, Weddings, and Parties https://www.makeuseof.com/tag/best-dj-equipment/ djing 2022-04-13 09:17:29
海外TECH DEV Community Secure AWS deploys from Github Actions with OIDC https://dev.to/aws-builders/secure-aws-deploys-from-github-actions-with-oidc-fhc Secure AWS deploys from Github Actions with OIDCLong gone are the days when you had to keep long lived access keys in your CI CD pipelines to deploy to AWS Learn how to use OIDC OpenID Connect to securely deploy to AWS from Github Actions and how to use GitHub Environments to secure deployments to specific AWS environments IntroductionManaging access from your CI CD systems to your cloud environments in a secure manner can often be a tedious challenge For a long time one common way to do so for AWS was to set up an IAM user for this purpose and then store the access keys for that user in e g GitHub secrets for use with GitHub Actions This required a lot of manual work such as setting up different users for different projects rotating keys after a certain period etc By utilizing OIDC you can configure AWS to trust GitHub as a federated identity provider and then use ID tokens in Github Actions workflows to authenticate to AWS and access resources You can create separate IAM roles for different purposes and allow workflows to assume those roles in a granular way Each job in a GitHub Actions workflow can request an OIDC token from GitHub s OIDC provider The contents of the token are described well in the GitHub documentation Depending on the job s configuration as well as what event triggered the workflow the token will contain a different set of claims One important claim is the sub claim which will be used in our IAM Role trust policies to grant permission for workflows to use a role For example if a workflow is triggered by a push to the main branch of a repository named repo org repo name the subject claim will be set to repo repo org repo name ref refs heads main If a job references a GitHub environment named Production the subject claim will be set to repo repo org repo name environment Production For a full list of possible subject claims check the GitHub documentation Example setupIn this example we will use two different S buckets to mimic two different AWS environments We will create two different workflows one that will be triggered by pushes to the main branch and another that will be triggered by pull requests On pull requests we want to be able to read our environments In a real example this could perhaps be a terraform plan job that could visualize any proposed changes to the environment We will also deploy to the development environment from pull requests to speed up the feedback loop We do however NOT want to deploy to the production environment from pull requests We also need to make sure that contributors cannot update the workflow as part of the pull request to include a deployment to the production environment For pushes to main we want to be able to read and deploy to both environments We will also require manual approval from a repository admin to deploy to the production environment In the example workflows read and deploy operations will be demonstrated by download and upload operations to the respective S buckets Create bucketsStart by creating two buckets in your account I will refer to these below as YOUR DEV BUCKET and YOUR PROD BUCKET Add GitHub as an identity providerTo be able to authenticate with OIDC from GitHub you will first need to set up GitHub as a federated identity provider in your AWS account To do that navigate to the AWS IAM console and click on Identity Providers on the left hand side Then click on the Add provider button For Provider type select OpenID Connect For Provider URL enter Click on Get thumbprint to get the thumbprint of the providerFor Audience enter sts amazonaws com Create roles and policiesNow that GitHub is set up as an identity provider it is time to create the roles and policies that will be assumed by the respective workflows You will create three roles Reader role Will have read permissions on both the buckets This role will be available both for pull request events as well as push events to the main branch Dev Deploy role Will have both read and write permissions to the dev bucket This role will require a workflow to use the Development GitHub environment Prod Deploy role Will have both read and write permissions to the prod bucket This role will require a workflow to use the Production GitHub environment Reader roleStart by creating a new role that will be used to read from the buckets The role should be able to be assumed by both workflows triggered by the pull request and push events To allow this the role will need to have the following trust policy Version Statement Effect Allow Principal Federated arn aws iam oidc provider token actions githubusercontent com Action sts AssumeRoleWithWebIdentity Condition ForAllValues StringEquals token actions githubusercontent com sub repo ORG OR USER NAME REPOSITORY pull request repo ORG OR USER NAME REPOSITORY ref refs heads main token actions githubusercontent com aud sts amazonaws com You need to replace the ORG OR USER NAME and REPOSITORY with your own values For my example repository eliasbrange aws github actions oidc it would be repo eliasbrange aws github actions oidc pull request Next add an IAM policy to this role that grants the role permission to read from the buckets Statement Action s ListBucket Effect Allow Resource arn aws s YOUR DEV BUCKET arn aws s YOUR PROD BUCKET Action s GetObject Effect Allow Resource arn aws s YOUR DEV BUCKET arn aws s YOUR PROD BUCKET Version Development Deploy RoleCreate another role that will be used to deploy to the development environment in this case by uploading a file to the development bucket This role should require the workflow to use a GitHub Environment named Development so it will require the following trust policy Version Statement Effect Allow Principal Federated arn aws iam oidc provider token actions githubusercontent com Action sts AssumeRoleWithWebIdentity Condition ForAllValues StringEquals token actions githubusercontent com sub repo ORG OR USER NAME REPOSITORY environment Development token actions githubusercontent com aud sts amazonaws com Give it the following IAM policy Statement Action s ListBucket Effect Allow Resource arn aws s YOUR DEV BUCKET Action s GetObject s PutObject Effect Allow Resource arn aws s YOUR DEV BUCKET Version Production Deploy RoleCreate the final role which will be used for production deploys It should look similar to the development role but with some changes to the permissions Add the following trust policy Version Statement Effect Allow Principal Federated arn aws iam oidc provider token actions githubusercontent com Action sts AssumeRoleWithWebIdentity Condition ForAllValues StringEquals token actions githubusercontent com sub repo ORG OR USER NAME REPOSITORY environment Production token actions githubusercontent com aud sts amazonaws com Give it the following IAM policy Statement Action s ListBucket Effect Allow Resource arn aws s YOUR PROD BUCKET Action s GetObject s PutObject Effect Allow Resource arn aws s YOUR PROD BUCKET Version Create GitHub EnvironmentsIn your GitHub repository create two environments Development and Production For Development we will not add any safeguards so that we can deploy to it directly from pull requests However for the Production environment we will add a few protection rules In the environment configuration under Environment protection rules tick the Required reviewers box and add your own GitHub username In the environment configuration under Deployment branches pick Selected branches in the dropdown and add main as an allowed branch The above rules will prevent contributors to modify the actual workflow as part of a pull request to deploy to production It will also force workflows that target the Production environment to require manual approval Add GitHub secretsWhile you could use secrets scoped to each environment I want to show that even if the workflow has access to the secret it will not be able to assume roles it shouldn t have access to In your GitHub repository add the following secrets DEV DEPLOY ROLE The ARN of your development deploy role PROD DEPLOY ROLE The ARN of your production deploy role READ ROLE The ARN of your reader role Create workflowsTime to create the deployment workflows You will create a total of three workflows one for pull requests one for pushes to main as well as a bonus workflow for visualizing which workflow event triggers that has access to which IAM roles The workflows will both try to download a file named README md from your buckets as well as upload a file from the repository root named README md to the buckets Either change the workflows to use another file or make sure that you have a README md file in your repository root To allow the first run of the workflows to succeed you also need to upload said file to both buckets Pull request workflowAdd the following workflow to your repository name Pull Request The workflow should only trigger on pull requests to the main branchon pull request branches main Required to get the ID Token that will be used for OIDCpermissions id token writejobs read dev runs on ubuntu latest steps name Checkout uses actions checkout v name Configure AWS Credentials uses aws actions configure aws credentials v with aws region eu west role to assume secrets READ ROLE role session name OIDCSession run aws s cp s YOUR DEV BUCKET README md README md shell bash write dev runs on ubuntu latest needs read dev environment Development steps name Checkout uses actions checkout v name Configure AWS Credentials uses aws actions configure aws credentials v with aws region eu west role to assume secrets DEV DEPLOY ROLE role session name OIDCSession run aws s cp README md s YOUR DEV BUCKET README md shell bash read prod runs on ubuntu latest steps name Checkout uses actions checkout v name Configure AWS Credentials uses aws actions configure aws credentials v with aws region eu west role to assume secrets READ ROLE role session name OIDCSession run aws s cp s YOUR PROD BUCKET README md README md shell bashThis is a very basic workflow that First downloads the README md file from the Development bucket Then uploads the README md from the repository to the Development bucket Finally downloads the README md file from the Production bucket Not a very exciting nor useful workflow but it demonstrates the usage of different roles and environments In both read dev and read prod there is no environment specified so the ID token in these jobs will have a subject claim of repo ORG OR USER NAME REPOSITORY pull request They are also both using the READ ROLE secret which should point to the ARN of your reader IAM role In write dev there is environment Development specified which means that the ID token will have a subject claim of repo ORG OR USER NAME REPOSITORY environment Development The job uses the DEV DEPLOY ROLE secret instead of READ ROLE Push to main workflowAdd the following workflow to your repository name Push The workflow should only trigger on push events to the main branchon push branches main Required to get the ID Token that will be used for OIDCpermissions id token writejobs read dev runs on ubuntu latest steps name Checkout uses actions checkout v name Configure AWS Credentials uses aws actions configure aws credentials v with aws region eu west role to assume secrets READ ROLE role session name OIDCSession run aws s cp s YOUR DEV BUCKET README md README md shell bash write dev runs on ubuntu latest needs read dev environment Development steps name Checkout uses actions checkout v name Configure AWS Credentials uses aws actions configure aws credentials v with aws region eu west role to assume secrets DEV DEPLOY ROLE role session name OIDCSession run aws s cp README md s YOUR DEV BUCKET README md shell bash read prod runs on ubuntu latest steps name Checkout uses actions checkout v name Configure AWS Credentials uses aws actions configure aws credentials v with aws region eu west role to assume secrets READ ROLE role session name OIDCSession run aws s cp s YOUR PROD BUCKET README md README md shell bash write prod needs read prod write dev runs on ubuntu latest environment Production steps name Checkout uses actions checkout v name Configure AWS Credentials uses aws actions configure aws credentials v with aws region eu west role to assume secrets PROD DEPLOY ROLE role session name OIDCSession run aws s cp README md s YOUR PROD BUCKET README md shell bashAlmost the same as the previous workflow but with the addition of a write prod job Download README md from the Development bucket Upload README md from the repository to the Development bucket Download README md from the Production bucket Upload README md from the repository to the Production bucket read prod uses the same setup as read dev i e no environment and both use the READ ROLE secret The write prod job is similar to the write dev job but uses environment Production and the PROD DEPLOY ROLE secret Bonus workflowThis workflow is a bonus workflow that uses a matrix strategy to show the result of different combinations of subject claim environment and role Add the following workflow to your repository name Bonus on pull request branches main push branches mainpermissions id token writejobs read dev strategy fail fast false matrix environment Development Production role secret READ ROLE DEV DEPLOY ROLE PROD DEPLOY ROLE runs on ubuntu latest environment matrix environment steps name Checkout uses actions checkout v name Configure AWS Credentials uses aws actions configure aws credentials v with aws region eu west role to assume secrets matrix role secret role session name OIDCSession run aws s cp s YOUR DEV BUCKET README md README md shell bash write dev strategy fail fast false matrix environment Development Production role secret READ ROLE DEV DEPLOY ROLE PROD DEPLOY ROLE runs on ubuntu latest environment matrix environment steps name Checkout uses actions checkout v name Configure AWS Credentials uses aws actions configure aws credentials v with aws region eu west role to assume secrets matrix role secret role session name OIDCSession run aws s cp README md s YOUR DEV BUCKET README md shell bash read prod strategy fail fast false matrix environment Development Production role secret READ ROLE DEV DEPLOY ROLE PROD DEPLOY ROLE runs on ubuntu latest environment matrix environment steps name Checkout uses actions checkout v name Configure AWS Credentials uses aws actions configure aws credentials v with aws region eu west role to assume secrets matrix role secret role session name OIDCSession run aws s cp s YOUR PROD BUCKET README md README md shell bash write prod strategy fail fast false matrix environment Development Production role secret READ ROLE DEV DEPLOY ROLE PROD DEPLOY ROLE runs on ubuntu latest environment matrix environment steps name Checkout uses actions checkout v name Configure AWS Credentials uses aws actions configure aws credentials v with aws region eu west role to assume secrets matrix role secret role session name OIDCSession run aws s cp README md s YOUR PROD BUCKET README md shell bashThis workflow will be triggered by both pushes to main and pull requests It will then try to perform reads and writes on both buckets with different combinations of environment and role secret ResultsWith all the workflows pushed to main in your repository create a new branch and create a pull request If everything works correctly the workflow named Pull Request should execute successfully The Bonus workflow should fail for a lot of combinations and it might take some time to fail due to the back off strategy of the configure AWS credentials action If you merge the pull request the workflow named Push should execute successfully Workflow combinationsLooking at the Bonus workflow we should be able to see which combinations of workflow event environment and role that is possible with the OIDC setup we have The working combinations should also be different depending on whether the workflow started from a push to main or a pull request For each of the jobs we should have different combinations These combinations are EnvironmentRoleNONEREAD ROLENONEDEV DEPLOY ROLENONEPROD DEPLOY ROLEDevelopmentREAD ROLEDevelopmentDEV DEPLOY ROLEDevelopmentPROD DEPLOY ROLEProductionREAD ROLEProductionDEV DEPLOY ROLEProductionPROD DEPLOY ROLE Pull requestIf we start by looking at a pull request the workflow should have access to the READ ROLE when the environment is NONE which means that the subject claim will be repo ORG OR USER NAME REPOSITORY pull request This should allow the workflow to read both development and production buckets when using the combination of no environment and READ ROLE It should also be able to use the Development environment since there are no protection rules for that environment This should allow the workflow to both read and write to the development bucket when the environment is Development and role is DEV DEPLOY ROLE Due to the environment protection rule on Production the workflow should not be able to use the Production environment at all when triggered by pull requests read dev job EnvironmentRoleResultNONEREAD ROLESuccessNONEDEV DEPLOY ROLEFail invalid claimNONEPROD DEPLOY ROLEFail invalid claimDevelopmentREAD ROLEFail invalid claimDevelopmentDEV DEPLOY ROLESuccessDevelopmentPROD DEPLOY ROLEFail invalid claimProductionREAD ROLEFail environment protection ruleProductionDEV DEPLOY ROLEFail environment protection ruleProductionPROD DEPLOY ROLEFail environment protection rule write dev job EnvironmentRoleResultNONEREAD ROLEFail insufficient permissionsNONEDEV DEPLOY ROLEFail invalid claimNONEPROD DEPLOY ROLEFail invalid claimDevelopmentREAD ROLEFail invalid claimDevelopmentDEV DEPLOY ROLESuccessDevelopmentPROD DEPLOY ROLEFail invalid claimProductionREAD ROLEFail environment protection ruleProductionDEV DEPLOY ROLEFail environment protection ruleProductionPROD DEPLOY ROLEFail environment protection rule read prod job EnvironmentRoleResultNONEREAD ROLESuccessNONEDEV DEPLOY ROLEFail invalid claimNONEPROD DEPLOY ROLEFail invalid claimDevelopmentREAD ROLEFail invalid claimDevelopmentDEV DEPLOY ROLEFail insufficient permissionsDevelopmentPROD DEPLOY ROLEFail invalid claimProductionREAD ROLEFail environment protection ruleProductionDEV DEPLOY ROLEFail environment protection ruleProductionPROD DEPLOY ROLEFail environment protection rule write prod job EnvironmentRoleResultNONEREAD ROLEFail insufficient permissionsNONEDEV DEPLOY ROLEFail invalid claimNONEPROD DEPLOY ROLEFail invalid claimDevelopmentREAD ROLEFail invalid claimDevelopmentDEV DEPLOY ROLEFail insufficient permissionsDevelopmentPROD DEPLOY ROLEFail invalid claimProductionREAD ROLEFail environment protection ruleProductionDEV DEPLOY ROLEFail environment protection ruleProductionPROD DEPLOY ROLEFail environment protection rule PushContinuing to the workflow when triggered by a push to main we should be able to again use the READ ROLE when the environment is NONE This should give read access to both buckets for that combination For writing to development a combination of Development environment with DEV DEPLOY ROLE is required Finally to deploy to production a combination of Production environment with PROD DEPLOY ROLE is required This should also trigger a manual approval step to continue the deployment read dev job EnvironmentRoleResultNONEREAD ROLESuccessNONEDEV DEPLOY ROLEFail invalid claimNONEPROD DEPLOY ROLEFail invalid claimDevelopmentREAD ROLEFail invalid claimDevelopmentDEV DEPLOY ROLESuccessDevelopmentPROD DEPLOY ROLEFail invalid claimProductionREAD ROLEFail invalid claimProductionDEV DEPLOY ROLEFail invalid claimProductionPROD DEPLOY ROLEFail invalid claim write dev job EnvironmentRoleResultNONEREAD ROLEFail insufficient permissionsNONEDEV DEPLOY ROLEFail invalid claimNONEPROD DEPLOY ROLEFail invalid claimDevelopmentREAD ROLEFail invalid claimDevelopmentDEV DEPLOY ROLESuccessDevelopmentPROD DEPLOY ROLEFail invalid claimProductionREAD ROLEFail invalid claimProductionDEV DEPLOY ROLEFail invalid claimProductionPROD DEPLOY ROLEFail invalid claim read prod job EnvironmentRoleResultNONEREAD ROLESuccessNONEDEV DEPLOY ROLEFail invalid claimNONEPROD DEPLOY ROLEFail invalid claimDevelopmentREAD ROLEFail invalid claimDevelopmentDEV DEPLOY ROLEFail insufficient permissionsDevelopmentPROD DEPLOY ROLEFail invalid claimProductionREAD ROLESuccessProductionDEV DEPLOY ROLEFail invalid claimProductionPROD DEPLOY ROLEFail invalid claim write prod job EnvironmentRoleResultNONEREAD ROLEFail insufficient permissionsNONEDEV DEPLOY ROLEFail invalid claimNONEPROD DEPLOY ROLEFail invalid claimDevelopmentREAD ROLEFail invalid claimDevelopmentDEV DEPLOY ROLEFail insufficient permissionsDevelopmentPROD DEPLOY ROLEFail invalid claimProductionREAD ROLEFail invalid claimProductionDEV DEPLOY ROLEFail invalid claimProductionPROD DEPLOY ROLESuccess ConclusionsStarting from this example you should now be able to use GitHub OIDC as a federated identity in your own GitHub Actions workflows to get rid of long lived credentials once and for all You have learned how to Enable federated identity for GitHub on the AWS sideDefine which repositories and workflow events are allowed to access your rolesKeep your production environments secure by requiring additional stepsHope you learned a thing or two Now go build something awesome GitHub repositoryI have a companion repository available on GitHub for this blog post where you can find the workflows themselves as well as terraform configuration for setting up the AWS side 2022-04-13 09:26:19
海外TECH DEV Community An Introduction to Canary Deployment https://dev.to/komaljprabhakar/an-introduction-to-canary-deployment-333l An Introduction to Canary Deployment Is Continuous Integration enough for testing With Continuous Integration every code from developers workstations gets incorporated into a shared repository or a central repository after which with automated build and tests these codes are checked and are merged What we notice here is that these are synthetic tests in simpler words the scripts on the basis of which these tests happen only emulate user experience They cannot help the IT team understand the actual end user experience This also doesn t give insights on device resources and health state which can also affect the application performance Understanding “Canary of Canary Deployment Before diving into the fact of how Canary Deployment helps us in understanding the actual end user experience let s begin our discussion by learning the significance of its name Well if somewhere your mind is comparing it with “Canary birds then you re on the right track In British Mining History these humble birds were used as “toxic gas detectors While mining for coals if there is any way emission of toxic gasses like Carbon Monoxide Nitrogen dioxide etc these birds alerted the miners about its presence as these birds are more sensitive to airborne toxins as compared to human beings Similarly the DevOps Engineers perform a canary deployment analysis of their code in CI CD pipeline to gauge any possible errors present However here the figurative canaries are a small set of users who will experience all the glitches present in the update Let s define Canary Deployment Canary deployment is a process or technique of controlled rolling of a software update to a small batch of users before making it available to everyone Thereby reducing the chances of widescale faulty user experience When the update is examined and feedback is taken this feedback is again applied and then released on a larger scale Steps Involved in Canary Deployment StrategyTo keep our customers happy and engaged it s important to roll out new updates from time to time Not ignoring the fact that every new change introduced might have an error or two attached to it we need a Canary Release Deployment analysis before releasing it to all our customers With the level of competition in the market any bug left unattended is going to attract customers displeasure and might cause a good loss of the Company s reputation Starting off by understanding the basic structure of the canary deployment strategy We can elucidate it under the following headings CreationAnalysisRoll out Roll back CreationTo begin with we need to create a canary infrastructure where our newest update gets deployed Now we need to direct a small amount of traffic to this newly created canary instance Well the rest would be still continuing with the older version of our software model AnalysisNow it s showtime for our DevOps team Here they need to constantly monitor the performance insights received data collected from network traffic monitors synthetic transaction monitors and all possible resources linked to the canary instance After the very awaited data gets collected the DevOps teams then start comparing it with the baseline version s data Roll out Roll backAfter the analysis is done it s time to think over the results of the comparative data and whether rolling out a new feature is a good decision or is it better to stick back to our baseline state and roll back the update Well then how are we benefitted Zero Production DowntimeYou know it when there s small traffic and the canary instance is not performing as expected you can simply reroute them to your baseline version When the engineers are conducting all sorts of tests at this point they can easily pinpoint the source of error and can effectively fix it or roll back the entire update and prepare for a new one Cost Efficient Friendly with smaller InfrastructureThe goal of Canary Release Deployment analysis is to drive a tiny amount of your customers to the newly created canary instance where the new update is deployed This means you re using a little extra of your infrastructure to facilitate the entire process In addition to that if we compare it with the blue green deployment strategy it requires an entire application hosting environment for deploying the new application As compared to blue green deployment we don t really have to put in our efforts in operating and maintaining the environment in canary deployment it s easier to enable and or disable any particular feature based on any criteria Room for Constant InnovationThe flexibility of testing new features with a small subset of users and being able to receive end user experience immediately is what motivates the dev team to bring in constant improvements updates We can increase the load of the canary instance up to and can keep track of the production stability of the enrolled features Do we have any Limitations Well everything has limitations What s important for us is to understand how to counteract them Time consuming and Prone to ErrorsEnterprises executing canary deployment strategy perform the deployments in a siloed fashion Then a DevOps Engineer is assigned to collect the data and analyze it manually This is quite time consuming as it is not scalable and hinders rapid deployments in CI CD processes There might be some cases where the analysis might go wrong and we might roll back a good update or roll forward a wrong one On Premise Applications are difficult to updateCanary Deployment looks like an appropriate and quite a possible approach when it comes to applications present in Cloud It is something to think about when the applications are installed on personal devices Even then we can have a way around it by setting up an auto update environment for end users Implementations might require some skill We are focusing right now is on the flexibility it offers to test different versions of our application but we should also bring our attention to managing the databases associated with all these instances For performing a proper canary deployment and to be able to compare the old version with the new one we need to modify the schema of the database to support more than one version of the application Thereby allowing the old and new versions to run simultaneously Wrapping up…With an increasing interest of Enterprises to perform canary deployment analysis it is to note that we need to counteract the limitations and make processes smoother We need some good continuous delivery solution providers or Managed Kubernetes orchestrators to automate certain functionalities to keep errors at bay and also integrate security at every stage of development 2022-04-13 09:16:12
海外TECH DEV Community Part 1: Installing and setting up React and Tailwind https://dev.to/entando/part-1-installing-and-setting-up-react-and-tailwind-m4o Part Installing and setting up React and TailwindIn this blog series we ll build a micro frontend using React and Tailwind CSS We ll break the series into two parts This being the first part we ll set up our React project here and install Tailwind step by step In the second part of this blog we will write code to build our stats micro frontend And later we will bundle publish deploy and use it from the Entando Component Repository ECR on a page newly created by us Just in case we don t all know what a micro frontend is here s a little explanation Imagine an UI or a website and what do you see A big frontend right Well this particular frontend application can be split up into several smaller pieces of a frontend we call micro frontends We can deploy and manage each of them independently We can use a variety of libraries and frameworks like React or Angular etc to build these micro frontends on a single page Now the question is how do we do this Before we get started I m assuming you re familiar with what a bundle is In case you re pretty new to this you can check out this blog To begin we refer to this template This is a simple React template that has all the files that we need to bundle and deploy our code All you have to do is clone it on your local machine and open it in your favorite code editor For the next part we need to navigate inside cd ui widgets widgets dir and create our React app Let s name it stats widget We run this command to create our react app npx create react app stats widgetOnce it s created we go inside it with cd stats widget and run npm start to check if the app was successfully created Now before we install Tailwind we need to make sure our project is ready for Entando bundling For that we create a bundle folder inside the stats widget folder and create two files The first one is stats descriptor yaml and the second is stats ftl This descriptor file contains some context about our widget and is also used to point to the ftl file And the ftl file is a FreeMarker template used to render the final HTML code It defines the viewed part while the descriptor defines the definition from a bundle point of view To get going paste this code inside your stats descriptor yaml file code stats widgettitles en Sample Stats Template it Sample Stats Templategroup freecustomUiPath stats ftlAnd paste this code inside the stats ftl file lt assign wp JspTaglibs aps core gt lt entando resource injection point gt lt Don t add anything above this line The build scripts will automatically link the compiled JS and CSS for you and add them above this line so that the widget can be loaded gt lt wp info key currentLang var currentLangVar gt lt stats widget locale currentLangVar gt Cool We are now done setting up our bundle folder But we still need to update the bundle src folder that s present inside the root directory of our project Hence we go back to the root directory and go inside our bundle src folder We open the descriptor yaml file and update the code by replacing the name of our React app and widget descriptor It should look like this code tailwind demodescription Template for Tailwind Componentscomponents widgets ui widgets widgets dir stats widget stats descriptor yamlPerfect now we are a done with setting up all the bundle folders At this point our project structure should look like this Now we can absolutely begin with installing Tailwind on our React app So let s navigate to our React app s directory cd ui widgets widgets dir stats widget Now I have a question Have you ever wondered why we use Tailwind Tailwind is a utility first CSS framework that is packed with many classes which are easy to use in our HTML tags These utility classes are named as per their function so that even a beginner can understand what a particular CSS class defines The best part about Tailwind CSS is that it s highly customizable Plus we don t need to spend hours writing CSS chunks as Tailwind makes them easier Visit the Tailwind CSS website to learn more Let s get started with the installation First we enter the stats widget folder e g cd ui widgets widgets dir stats widget from the root directory We then install Tailwind from our terminal with the next few commands Install Tailwind CSS Post CSS and Autoprefixer npm install D tailwindcss npm tailwindcss postcss compat tailwindcss postcss compat postcss autoprefixer Install CRACO React doesn t allow us to override Post CSS configuration by default but we can use CRACO to configure Tailwind npm install craco cracoCreate a config file for CRACO touch craco config jsAdd the configurations below module exports style postcssOptions plugins require tailwindcss require autoprefixer To tell our app to use CRACO we configure our package json file and replace everything under scripts with the following scripts start craco start build craco build test craco test eject react scripts eject Create the Tailwind configuration file using the full tag to generate all the default configurations npx tailwindcss init fullUsing the full tag is optional It involves a huge configuration you might not want to deal with Please do not forget to replace the existing purge entity under module exports with this purge src js jsx ts tsx public index html Go to the src folder and replace the contents of the existing index css file with the following tailwind base tailwind components tailwind utilities This index css file consists of all the Tailwind base styles Exit the src folder and open the package json file to configure our app to use CRACO to build our styles every time we run our app using npm start or npm build To do this we insert the following syntax under the scripts section of the package json file build style tailwind build src styles index css o src styles tailwind css Import Tailwind CSS base styles to our index js file import index css Delete the app css file and change our app js file to this function App return lt div gt Hi there lt div gt export default App We have completed the Tailwind installation and configuration We can test our React app by generating a page that says “Hi there If it runs then perfect We are all set Attention If we get an error about PostCSS versioning or Autoprefixer versioning we can use the following commands npm uninstall tailwindcssnpm install D tailwindcss latest postcss latest autoprefixer latestYou have now installed Tailwind correctly Well that s all for this blog In the next blog of this series we will do the following Write code to create our stats component Build the React app Wrap our micro frontend inside a custom UI element If you re curious about it you can check out this documentation till the time the blog is Live Prepare our project directory for the ENT cli to bundle it Build Push and Deploy the bundle to the Entando Component Repository ECR Drag and drop the stats widget on a page I hope that s really exciting Meanwhile you re here so I d like to mention we at Entando are building a community to spread awareness of Composable and Modular applications There is a lot more we are trying to do with our community If you feel like engaging or contributing to our community please join our Discord Server and let s learn together See you in the next blog Thank you 2022-04-13 09:14:56
海外TECH DEV Community Append "botinfo" command for look deploy, server and user information in Telegram bot on NestJS https://dev.to/endykaufman/append-botinfo-command-for-look-deploy-server-and-user-information-in-telegram-bot-on-nestjs-3242 Append quot botinfo quot command for look deploy server and user information in Telegram bot on NestJS Links source code of bot current bot in telegram Update files Update configlibs core server src lib bot commands bot commands config bot commands config tsexport const BOT COMMANDS CONFIG Symbol BOT COMMANDS CONFIG export interface BotCommandsConfig admins string version string commit string date string maxRecursiveDepth number prepareCommandString command string gt string Add servicelibs core server src lib bot commands bot commands services bot commands botinfo service tsimport Inject Injectable from nestjs common import BotCommandsConfig BOT COMMANDS CONFIG from bot commands config bot commands config import BotCommandsProviderActionResultType from bot commands types bot commands provider action result type import BotCommandsProvider BotCommandsProviderActionMsg from bot commands types bot commands provider interface import BotCommandsToolsService from bot commands tools service Injectable export class BotCommandsBotinfoService implements BotCommandsProvider constructor Inject BOT COMMANDS CONFIG private readonly botCommandsConfig BotCommandsConfig private readonly botCommandsToolsService BotCommandsToolsService async onHelp return null async onMessage lt TMsg extends BotCommandsProviderActionMsg BotCommandsProviderActionMsg gt msg TMsg Promise lt BotCommandsProviderActionResultType lt TMsg gt gt if this botCommandsToolsService checkCommands msg text botinfo en const formatMemoryUsage data gt Math round data MB const memoryData process memoryUsage const markdown this botCommandsToolsService isAdmin msg Server RSS formatMemoryUsage memoryData rss Heap total formatMemoryUsage memoryData heapTotal Heap used formatMemoryUsage memoryData heapUsed V external formatMemoryUsage memoryData external n Bot Version this botCommandsConfig version unknown Date this botCommandsConfig date unknown Commit this botCommandsConfig commit unknown n Chat ID msg chat id msg from id unknown join n split join split join return type markdown message msg markdown markdown return null Update modulelibs core server src lib bot commands bot commands module tsimport DynamicModule Module from nestjs common import CustomInjectorModule from nestjs custom injector import TranslatesModule from nestjs translates import BotCommandsConfig BOT COMMANDS CONFIG from bot commands config bot commands config import BotCommandsBotinfoService from bot commands services bot commands botinfo service import BotCommandsToolsService from bot commands services bot commands tools service import BotCommandsService from bot commands services bot commands service import BOT COMMANDS PROVIDER from bot commands types bot commands provider interface Module imports CustomInjectorModule TranslatesModule providers BotCommandsToolsService BotCommandsService exports CustomInjectorModule TranslatesModule BotCommandsToolsService BotCommandsService export class BotCommandsModule static forRoot config BotCommandsConfig DynamicModule return module BotCommandsModule providers provide BOT COMMANDS CONFIG useValue lt BotCommandsConfig gt maxRecursiveDepth config provide BOT COMMANDS PROVIDER useClass BotCommandsBotinfoService exports BOT COMMANDS CONFIG Update app moduleapps server src app app module tsimport BotInGroupsModule from kaufman bot bot in groups server import BotCommandsModule PrismaClientModule from kaufman bot core server import CurrencyConverterModule from kaufman bot currency converter server import DebugMessagesModule from kaufman bot debug messages server import DialogflowModule from kaufman bot dialogflow server import FactsGeneratorModule from kaufman bot facts generator server import FirstMeetingModule from kaufman bot first meeting server import JokesGeneratorModule from kaufman bot jokes generator server import DEFAULT LANGUAGE LanguageSwitherModule from kaufman bot language swither server import QuotesGeneratorModule from kaufman bot quotes generator server import ShortCommandsModule from kaufman bot short commands server import Module from nestjs common import env from env var import TelegrafModule from nestjs telegraf import getDefaultTranslatesModuleOptions TranslatesModule from nestjs translates import join from path import AppController from app controller import AppService from app service const TELEGRAM BOT WEB HOOKS DOMAIN env get TELEGRAM BOT WEB HOOKS DOMAIN asString const TELEGRAM BOT WEB HOOKS PATH env get TELEGRAM BOT WEB HOOKS PATH asString Module imports TelegrafModule forRoot token env get TELEGRAM BOT TOKEN required asString launchOptions dropPendingUpdates true TELEGRAM BOT WEB HOOKS DOMAIN amp amp TELEGRAM BOT WEB HOOKS PATH webhook domain TELEGRAM BOT WEB HOOKS DOMAIN hookPath TELEGRAM BOT WEB HOOKS PATH PrismaClientModule forRoot databaseUrl env get SERVER POSTGRES URL required asString logging long queries maxQueryExecutionTime TranslatesModule forRoot getDefaultTranslatesModuleOptions localePaths join dirname assets in join dirname assets in getText join dirname assets in class validator messages vendorLocalePaths join dirname assets in locales DEFAULT LANGUAGE ru DebugMessagesModule forRoot BotCommandsModule forRoot admins env get TELEGRAM BOT ADMINS default asArray prepareCommandString command string gt command split ё join е commit env get DEPLOY COMMIT default asString date env get DEPLOY DATE default asString version env get DEPLOY VERSION default asString ShortCommandsModule forRoot commands en joke get jokes quote thought wisdom get quotes fact history get facts forgot me meet reset what you can do faq help disable debug debug off enable debug debug on ru joke шутка шутку шутки пошути шути рассмеши смешинки смешинка get jokes quote thought wisdom цитата дайцитату цитируй мысль мудрость залечи get quotes fact history история историю факты get facts forgot me забудьменя meet reset what you can do faq чтотыумеешь справка help disable debug выключидебаг debug off enable debug включидебаг debug on BotInGroupsModule forRoot botNames en Endy Kaufman ru Энди Endy Kaufman Енди Кауфман botMeetingInformation en Hello I m Endy Hello Hello ru Всемпривет яЭнди Всемпривет Всемпривет LanguageSwitherModule forRoot CurrencyConverterModule forRoot FactsGeneratorModule forRoot QuotesGeneratorModule forRoot JokesGeneratorModule forRoot FirstMeetingModule forRoot botName en Endy ru Энди DialogflowModule forRoot projectId env get DIALOGFLOW PROJECT ID required asString controllers AppController providers AppService export class AppModule Update deploy script github workflows develop deploy ymlname deploy yamllint disable line rule truthyon push branches feature jobs migrate runs on self hosted develop vps environment dev steps name Cloning repo uses actions checkout v with fetch depth name Apply migrations run curl o https raw githubusercontent com nvm sh nvm v install sh bash nvm nvm sh nvm version nvm install v nvm use v npm i force export POSTGRES HOST dokku postgres info global postgres internal ip export ROOT POSTGRES URL postgres postgres secrets ROOT POSTGRES PASSWORD POSTGRES HOST postgres schema public export SERVER POSTGRES URL secrets SERVER POSTGRES URL npm run rucken postgres export DATABASE URL SERVER POSTGRES URL amp amp npm run migrate export DEPLOY DATE date Y m d H M S export DEPLOY COMMIT GITHUB SHA export DEPLOY VERSION node pe require package json version dokku config set no restart kaufman bot SERVER POSTGRES URL SERVER POSTGRES URL dokku config set no restart global POSTGRES HOST global postgres dokku config set no restart kaufman bot GOOGLE APPLICATION CREDENTIALS google credentials json dokku config set no restart kaufman bot GOOGLE CREDENTIALS secrets GOOGLE CREDENTIALS dokku config set no restart kaufman bot DIALOGFLOW PROJECT ID secrets DIALOGFLOW PROJECT ID dokku config set no restart kaufman bot TELEGRAM BOT WEB HOOKS DOMAIN secrets TELEGRAM BOT WEB HOOKS DOMAIN dokku config set no restart kaufman bot TELEGRAM BOT WEB HOOKS PATH secrets TELEGRAM BOT WEB HOOKS PATH dokku config set no restart kaufman bot DEPLOY DATE DEPLOY DATE dokku config set no restart kaufman bot DEPLOY COMMIT DEPLOY COMMIT dokku config set no restart kaufman bot DEPLOY VERSION DEPLOY VERSION deploy needs migrate runs on ubuntu latest environment dev steps name Cloning repo uses actions checkout v with fetch depth name Push to dokku uses dokku github action master with branch feature git remote url ssh dokku secrets HOST kaufman bot ssh private key secrets SSH PRIVATE KEY Check from telegram Check from user chat Check from group chat 2022-04-13 09:12:03
海外TECH DEV Community Complete guide to deploying SSR Vite apps on AWS with automation https://dev.to/canopassoftware/complete-guide-to-deploying-ssr-vite-apps-on-aws-with-automation-39gc Complete guide to deploying SSR Vite apps on AWS with automationWant to learn how to deploy the Server side rendering SSR Vite app on AWS In this blog we will see how to deploy the SSR Vite app on AWS and automate the deployment process Though this article refers to only the Vite app it works for almost all SSR apps This blog will answer many of your questions Like What is SSR How does the SSR app deployment work How do you implement SSR for SEO and loading time optimizations You will learn how to deploy the SSR app on AWS using the following services ECS Elastic container service ECR Elastic container registry ELB Elastic load balancer CloudformationGithub actions to automate deploymentWe will create an AWS cloudformation stack that contains all required resources to deploy the app and it will be automated by Github actions You can read full article here 2022-04-13 09:09:46
海外TECH DEV Community Is Cloud Expensive? https://dev.to/aws-builders/is-cloud-expensive-1l1 Is Cloud Expensive The Cloud Computing is one of the latest trends in Software industry Well yes it s not so latest as AWS was launched in mid s and since then has become a monster of its own There was a famous iPhone commercial There s an app for that And now we can say that for any use case there s a Service for that AWS is the Cloud services provider which has the majority market cap in this space Still it is small compared with the whole software industry There are millions of applications of every kind which are not using any Cloud services Specially in developing and under developed countries the hesitation to adopt cloud is because of it s cost These markets being so cost sensitive don t consider the total cost of ownership concept heavily marketed by cloud vendors of all types I ve tried to remove this misconception and came up with reasons of my own on why cloud is cheap Utility PricingSame like we have billing for our house utilities cloud services are charged on usage If you don t use any services then don t have to pay anything Free TierWhen we sign up for a new AWS account it comes with lot of free stuff Some of these free tier services are for months and some others are forever free again based on usage Get the details of all such services here Student Program AWS EducationIf you re student then access AWS limited services for free via AWS Educate program One must have active edu email address I received in credits when I was a student and learned a lot with those Startup Program AWS ActivateAWS specially love startups not only they might be next Netflix app but it s all free promotion and adoption of AWS A startup can get up to in AWS credits More details here Reserve InstancesIf the application is compute intensive or just require stable EC instances then Reserving instances for or years is a good idea to save some bucks It is up to cheaper then on demand utility price Savings PlanSavings plan is similar to Reserved Instances where the purchase is not based on server per year but per hour Spot InstancesThe magic behind the AWS on demand capacity is Redundancy But it also means that resource utilization is not optimal most of the time This extra capacity is available to the consumers via SPOT Instances AWS LightsailNot every application is using Microservices Architecture In fact majority of the web is working on PHP language and is just WordPress To setup such server should not take few clicks and few minutes That s what LightSail does It has lot of templates to create servers with a fixed pricing model LightSail is perfect entry point to the cloud for lot of users Economy of ScaleThis point applies for someone who is thinking long term As your cloud usage grows the cost doesn t spikes up linearly It goes down actually The higher the AWS bill the more discount you ll get Commitment FreeThis is my favorite point and consideration with the cloud There are just no commitments at all and anyone can mix and match any services with any provider One can stop any service and not have to worry about the bills from that point onwards YouTube VideoI also happen to make a video on the topic in case if anyone wants to hear all of my love for AWS If you ve made it so far then here s the slides I used in the video for you 2022-04-13 09:03:01
海外TECH DEV Community What’s new in React 18? https://dev.to/rishikeshvedpathak/whats-new-in-react-18-5np What s new in React What s new in React The new version of React is out and it is now available on npm It has introduced some new interesting features You won t have to change any code or learn a new concept and very importantly it does not break any of your existing code How To UpdateIt is fairly simple to update to the latest version of React in your existing project Follow the below steps to update to React Update dependencies for npm npm install react react dom or for yarn yarn add react react dom Go to your root index js and make the below changes Before import render from react dom const container document getElementById app render lt App gt container After import createRoot from react dom client const container document getElementById app const root createRoot container root render lt App gt The React createRoot creates a root running in React which adds all of the improvements of React and allows you to use concurrent features This will be the root API moving forward That s all You don t need to make any other code changes ConcurrencyThis is the most important concept added in React Concurrency is not a feature it is an implementation detail It helps with state update prioritization so that urgent state updates can be prioritized over less urgent time consuming blocking updates A key property of Concurrent React is that rendering is interruptible React always process all state updates in the order they were triggered i e in a single uninterrupted synchronous transaction With the addition of this concurrency concept you can tell react that a certain state update has a lower priority than the others and then react will treat other state updates with higher priority You can tell react that a state has a lower priority using one of the new APIs that are newly introduced in React ーuseTransition and startTransition useTransition and startTransitionuseTransition and startTransition let you mark some state updates as not urgent Other state updates are considered urgent by default e g urgent state updates ーupdating a text inputnon urgent state updates ーrendering a list of search results useTransitionSince it is a hook this can be used in functional components It returns an array with two values isPending a stateful value for the pending state of the transitionstartTransition a function to start the transition startTransitionThis method is designed to be used when useTransition is not available e g class components This lets you mark updates inside the provided callback as transitions See the working examples below We have a list of two thousand users to be rendered on UI In the first example we have used the traditional way of rendering the list whereas in the second example we have used the useTransition hook You will notice the performance difference while searching for a user In the first example you will notice a bit of lag while typing in the input box this is because React is waiting for a filtered list state update and then updates the state of the input box And in the second example we are telling React to keep the list update state on low priority which results in performance improvement Without prioritization With concurrency rendering Important Note It is not recommended to wrap every state update with startTransition instead we should use it only when there is no other solution available to increase the UI performance useDefferedValues It tells React to show the old state until a newly updated state is ready This is similar to statrTransition but can be used where you don t have complete control over the state updates e g state passed from parent component to a child component To use this you simply need to wrap the state variable inside useDeffedValue and then your child component will get the old state until an updated value is available const userList useDeferredValue props list New Hooks for LibrariesReact has introduced a few hooks useSyncExternalStoreuseInsertionEffectuseIdNote These hooks are intended to be used by libraries not application code As an application developer you will probably ever use these hooks Automatic BatchingAutomatic batching allows grouping multiple state updates together so that they get executed as one state update This results in a single re render for better performance Batching was already present in the older version of React but was limited to React event handlers only For example if you have two state updates inside of the same click event React has always batched these into one re render If you run the following code you ll see that every time you click React only performs a single render although you set the state twice With automatic batching React now support state update batching inside promises setTimeout native event handlers or any other event that was not batched in React by default See the below example What if I don t want to batch Usually batching is safe but some code may depend on reading something from the DOM immediately after a state change For those use cases you can use ReactDOM flushSync to opt out of batching import flushSync from react dom Note react dom not react function handleClick flushSync gt setCounter c gt c React has updated the DOM by now flushSync gt setFlag f gt f React has updated the DOM by now New Suspense FeaturesThe suspense component is already present in the older version of React However the only supported use case was code splitting using React lazy and it wasn t supported at all when rendering on the server You must have been using the Suspense component to show a fallback component until another component is ready for rendering i e lazy loading of components lt Suspense fallback lt Spinner gt gt lt LaziComponent gt lt Suspense gt React allows Suspense work for server side rendering and in case of data fetching in near future Server ComponentsReact introduced Server Components however these are still in the research and development phase These components are meant to render on the server only allowing to perform certain server side only logic This will be used to perform tasks on the server that should not run on the client may be for security or performance reasons This is an experimental feature and not ready for you to use in your app We are expecting this to be released in near future More details are available here 2022-04-13 09:00:57
海外TECH Engadget Clubhouse's latest experiment is in-room games https://www.engadget.com/clubhouse-experimental-gaming-feature-095117760.html?src=rss Clubhouse x s latest experiment is in room gamesClubhouse has started testing an in room gaming feature the company has confirmed to TechCrunch and its debut game could help users get to know each other better The feature s launch title called quot Wild Cards quot presents users a series of icebreaker questions and challenges It could ask you to quot pitch an idea for a movie in seconds quot for instance or to quot share five things from your search history quot To play you ll have to choose the quot Games quot option under Rooms It ll open a social room for you where you can invite friends to play nbsp As TechCrunch notes launching a feature not offered by other audio services such as Twitter Spaces could be part of Clubhouse s efforts to get new users to stay The audio chat app shot up in popularity in the early days of the pandemic when full lockdowns were implemented and people were looking to connect with friends and strangers in new ways Its success led to the development and launch of other audio products by more well known companies including Facebook s audio rooms Spotify s Greenroom that s been rebranded as Spotify Live and Amazon s Amp nbsp Clubhouse s popularity and download numbers took a hit when those rival services emerged especially after pandemic restrictions started lifting A lot of users chose to move to more established social networks offering similar audio services because they already have an existing network on those platforms Still Clubhouse told The New York Times back in December that it s still growing and that the company is confident it s not just a pandemic fad Over the past few months the service has released a handful of new and experimental features in an effort to get users to stay One of them is the chat function in voice rooms for those who d rather text and the other is support for web listening in the US 2022-04-13 09:51:17
医療系 医療介護 CBnews 仮名化データ創薬研究開発に活用、法整備に期待も-検討会で製薬協などにヒアリング https://www.cbnews.jp/news/entry/20220413175500 名誉教授 2022-04-13 18:05:00
金融 ニュース - 保険市場TIMES 東京海上日動、スマートシティ・インスティテュートと協業を開始 https://www.hokende.com/news/blog/entry/2022/04/13/190000 東京海上日動、スマートシティ・インスティテュートと協業を開始「LiveableWellBeingCity指標」一般社団法人スマートシティ・インスティテュート以下、SCIJapanと東京海上日動火災保険株式会社以下、東京海上日動は年月日、「LiveableWellBeingCity指標」以下、LWC指標を用いた、自治体向け都市分析モデルの活用促進において協業を開始すると発表した。 2022-04-13 19:00:00
ニュース @日本経済新聞 電子版 JR東日本と京王、新宿に超高層ビル 28年度開業 https://t.co/AFR4b8RyDZ https://twitter.com/nikkei/statuses/1514177881257046018 超高層ビル 2022-04-13 09:45:13
ニュース @日本経済新聞 電子版 日揮と川崎汽船が中古タンカーをLNG洋上プラントに再利用。建設費が3割安く、採掘ガスを陸まで運ぶパイプラインも不要に。アフリカ沿岸などに生産地が広がる可能性があります。 #日経イブニングスクープ https://t.co/Ie3uK0MYgu https://twitter.com/nikkei/statuses/1514174112507305984 2022-04-13 09:30:15
ニュース @日本経済新聞 電子版 新潟・長岡花火、3年ぶり開催へ 「平和の祈りを」  https://t.co/sTtsJOOffN https://twitter.com/nikkei/statuses/1514173379523145734 開催 2022-04-13 09:27:20
ニュース @日本経済新聞 電子版 国内ワクチン接種、3回完了は5880万人 #日経_チャートで見る日本の接種状況 #日経ビジュアルデータ https://t.co/BQrxqvQZLS https://twitter.com/nikkei/statuses/1514170280989630464 状況 2022-04-13 09:15:01
ニュース @日本経済新聞 電子版 SMBC日興事件、捜査終結 事件で何が問われたのか https://t.co/sDvKfwejmO https://twitter.com/nikkei/statuses/1514167197874405379 終結 2022-04-13 09:02:46
ニュース @日本経済新聞 電子版 インドへ武器・NZ利上げ・アマゾンの見守りドアベル https://t.co/8CTDAwsSHi https://twitter.com/nikkei/statuses/1514166933004128271 武器 2022-04-13 09:01:43
ニュース @日本経済新聞 電子版 日揮と川崎汽船、中古タンカーでLNG洋上生産基地 【日経イブニングスクープ】 https://t.co/cZ0JisCeYw https://twitter.com/nikkei/statuses/1514166930705313796 川崎汽船 2022-04-13 09:01:42
海外ニュース Japan Times latest articles Japan looks to avoid another quasi-emergency as seventh COVID wave looms https://www.japantimes.co.jp/news/2022/04/13/national/covid-seventh-wave-japan/ Japan looks to avoid another quasi emergency as seventh COVID wave loomsAhead of the Golden Week holiday period the government is concerned that if Okinawa is hit with a fresh infection wave the virus will spread 2022-04-13 18:28:09
ニュース BBC News - Home Boris Johnson and Rishi Sunak reject calls to resign over lockdown fines https://www.bbc.co.uk/news/uk-politics-61083402?at_medium=RSS&at_campaign=KARANGA apologise 2022-04-13 09:46:24
ニュース BBC News - Home P&O ferry detained at Dover over deficiencies https://www.bbc.co.uk/news/business-61086897?at_medium=RSS&at_campaign=KARANGA disruption 2022-04-13 09:05:37
ニュース BBC News - Home UK officials investigating total of 74 child hepatitis cases https://www.bbc.co.uk/news/health-61085870?at_medium=RSS&at_campaign=KARANGA covid 2022-04-13 09:45:13
ニュース BBC News - Home Online porn: 'My pupils ask me about violence' https://www.bbc.co.uk/news/education-61045514?at_medium=RSS&at_campaign=KARANGA pornography 2022-04-13 09:07:05
ビジネス 不景気.com SFPの22年2月期は79億円の営業赤字へ、コロナ長期化で - 不景気.com https://www.fukeiki.com/2022/04/sfp-hd-2022-loss3.html 磯丸水産 2022-04-13 09:06:02
北海道 北海道新聞 公設民営塾地元で進学率向上 「足寄町学習塾」高校存続に一役 運営会社、道内外に事業拡大へ https://www.hokkaido-np.co.jp/article/669088/ 事業拡大 2022-04-13 18:30:00
北海道 北海道新聞 パーティー参加の英首相、コロナ規制違反で罰金処分 辞任否定、退陣論が再燃 https://www.hokkaido-np.co.jp/article/669083/ 内本智子 2022-04-13 18:25:00
北海道 北海道新聞 陸自トップも反戦デモ例示 外部講演、副大臣が陳謝 https://www.hokkaido-np.co.jp/article/669084/ 陸上自衛隊 2022-04-13 18:25:00
北海道 北海道新聞 英国の物価、7・0%上昇 3月、30年ぶり高水準 https://www.hokkaido-np.co.jp/article/669082/ 消費者物価指数 2022-04-13 18:24:00
北海道 北海道新聞 音別に交流拠点「おんぽーと」 24日オープン 和紙工房設置、特産品販売も https://www.hokkaido-np.co.jp/article/669081/ 音別 2022-04-13 18:22:00
北海道 北海道新聞 香港、前政務官が立候補 行政長官選、事実上当選確定 https://www.hokkaido-np.co.jp/article/669080/ 当選確定 2022-04-13 18:21:00
北海道 北海道新聞 中標津高の李家教諭に文科大臣表彰 中標津高・伊藤教諭、中標津農高・加瀬教諭には道実践表彰 https://www.hokkaido-np.co.jp/article/669048/ 高校 2022-04-13 18:19:55
北海道 北海道新聞 実況ユーチューバー再逮捕 「ねこくん!」大麻所持か https://www.hokkaido-np.co.jp/article/669079/ 大麻取締法 2022-04-13 18:19:00
北海道 北海道新聞 Jリーグ前チェアマンが企業支援 村井満氏が投資ファンド設立 https://www.hokkaido-np.co.jp/article/669078/ 企業支援 2022-04-13 18:18:00
北海道 北海道新聞 大阪で5121人感染 2日連続で5千人超え https://www.hokkaido-np.co.jp/article/669076/ 新型コロナウイルス 2022-04-13 18:17:00
北海道 北海道新聞 ウクライナ脱出邦人、永住帰国意向「きょうだいのそばで暮らす」 https://www.hokkaido-np.co.jp/article/669074/ 永住帰国 2022-04-13 18:16:00
北海道 北海道新聞 トヨタ25・9万台リコール エンジン破損の恐れ https://www.hokkaido-np.co.jp/article/669070/ 防水性 2022-04-13 18:12:00
北海道 北海道新聞 HIS、ハワイツアー5月再開 ANA、日航系列の旅行会社も https://www.hokkaido-np.co.jp/article/669068/ 新型コロナウイルス 2022-04-13 18:03:00
北海道 北海道新聞 国公立病院の収益が急改善 20年度、コロナ巨額補助金で https://www.hokkaido-np.co.jp/article/669067/ 医療従事者 2022-04-13 18:02:00
北海道 北海道新聞 新潟・長岡花火、3年ぶり開催へ コロナ禍で過去2年中止 https://www.hokkaido-np.co.jp/article/669066/ 新型コロナウイルス 2022-04-13 18:01:00
IT 週刊アスキー 『スーパーロボット大戦30』に新たな機体&ストーリーを追加する無料アップデート&「エキスパンションパック」が配信決定! https://weekly.ascii.jp/elem/000/004/089/4089191/ 無料アップデート 2022-04-13 18:55:00
海外TECH reddit Meta、メタバースでの販売に47.5%の手数料を設定…バーチャル商品販売ツールのテストを開始 https://www.reddit.com/r/newsokunomoral/comments/u2mc9w/metaメタバースでの販売に475の手数料を設定バーチャル商品販売ツールのテストを開始/ ewsokunomorallinkcomments 2022-04-13 09:24:53

コメント

このブログの人気の投稿

投稿時間:2021-06-17 22:08:45 RSSフィード2021-06-17 22:00 分まとめ(2089件)

投稿時間:2021-06-20 02:06:12 RSSフィード2021-06-20 02:00 分まとめ(3871件)

投稿時間:2021-06-17 05:05:34 RSSフィード2021-06-17 05:00 分まとめ(1274件)