IT |
気になる、記になる… |
Google is testing a new bottom bar in the desktop version of Google Maps |
https://taisy0.com/2021/12/21/149962.html
|
androidpolice |
2021-12-20 16:16:23 |
AWS |
AWS Big Data Blog |
Query cross-account AWS Glue Data Catalogs using Amazon Athena |
https://aws.amazon.com/blogs/big-data/query-cross-account-aws-glue-data-catalogs-using-amazon-athena/
|
Many AWS customers rely on a multi-account strategy to scale their organization and better manage their data lake across different projects or lines of business. The AWS Glue Data Catalog contains references to data used as sources and targets of your extract, transform, and load (ETL) jobs in AWS Glue. Using a centralized Data Catalog … |
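As a minimal sketch of the idea in the excerpt: once another account's Glue Data Catalog is registered in Athena, tables are addressed with a three-part identifier, catalog.database.table. The catalog, database, and table names below are hypothetical, and the query-building helper is an illustration, not AWS code.

```python
def build_cross_account_query(catalog: str, database: str, table: str, limit: int = 10) -> str:
    """Build an Athena SQL string using a three-part identifier that targets
    a cross-account Glue Data Catalog registered under `catalog`."""
    return f'SELECT * FROM "{catalog}"."{database}"."{table}" LIMIT {limit}'

# Hypothetical names: "crossaccountcatalog" is the registered catalog alias.
query = build_cross_account_query("crossaccountcatalog", "salesdb", "orders")
print(query)
# In a real setup, this string would be submitted via Athena (for example
# with boto3's athena client and start_query_execution); that call is
# omitted here since it needs live AWS credentials.
```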
2021-12-20 16:58:43 |
python |
New posts tagged Python - Qiita |
I googled the differently-named terms for associative-array-like structures |
https://qiita.com/j5c8k6m8/items/4717884ab4390089a695
|
Still, writing this piece made me look again at Google search results: results from the same domain no longer cluster at the top, and I recall that Qiita articles once filled almost the entire first page. Since good articles are not guaranteed to rank highly, I now feel more strongly than before that beginners who rely on Google, especially for the fundamentals, may be taking a longer detour than they used to. |
2021-12-21 01:49:15 |
python |
New posts tagged Python - Qiita |
An extension that recommends highly similar ABC problems |
https://qiita.com/dakkenkd425/items/366a81e5f34e1263627f
|
Since we don't want problem-statement similarity to be computed based on whether such characters appear in the text, we strip these symbols out as well, just as we do with proper nouns. |
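The preprocessing step described above can be sketched in a few lines: drop symbols before tokenizing, then compare statements with a set-based similarity. This is an illustrative reimplementation of the idea, not the extension's actual code.

```python
import re

def tokens(text: str) -> set:
    # Keep only alphabetic word tokens; symbols like "###" or "***" vanish,
    # so decorative characters cannot inflate the similarity score.
    return set(re.findall(r"[a-z]+", text.lower()))

def jaccard(a: str, b: str) -> float:
    """Jaccard similarity of the symbol-stripped token sets."""
    ta, tb = tokens(a), tokens(b)
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0

s1 = "Given N integers, find the maximum sum ###"
s2 = "Given N integers, find the minimum sum ***"
print(round(jaccard(s1, s2), 2))  # 0.75
```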
2021-12-21 01:30:38 |
python |
New posts tagged Python - Qiita |
The rules for when +0 and -0 appear as results of floating-point operations |
https://qiita.com/ikiuo/items/d3e5db14cd3c5850a115
|
Conversely, even without a comparison instruction that judges "+0" and "-0" to be unequal, we would hardly be inconvenienced. |
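The point of the excerpt can be demonstrated directly in Python, whose floats follow IEEE 754: ordinary comparison treats +0.0 and -0.0 as equal, and the sign bit is only observable through side channels such as copysign or string conversion.

```python
import math

pos, neg = 0.0, -1.0 * 0.0   # multiplying by a negative is one rule that yields -0

print(pos == neg)                # True: the standard comparison says they match
print(math.copysign(1.0, pos))   # 1.0
print(math.copysign(1.0, neg))   # -1.0: the sign bit survives and is observable
print(str(neg))                  # "-0.0"
```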
2021-12-21 01:07:51 |
js |
New posts tagged JavaScript - Qiita |
I googled the differently-named terms for associative-array-like structures |
https://qiita.com/j5c8k6m8/items/4717884ab4390089a695
|
Still, writing this piece made me look again at Google search results: results from the same domain no longer cluster at the top, and I recall that Qiita articles once filled almost the entire first page. Since good articles are not guaranteed to rank highly, I now feel more strongly than before that beginners who rely on Google, especially for the fundamentals, may be taking a longer detour than they used to. |
2021-12-21 01:49:15 |
js |
New posts tagged JavaScript - Qiita |
The rules for when +0 and -0 appear as results of floating-point operations |
https://qiita.com/ikiuo/items/d3e5db14cd3c5850a115
|
Conversely, even without a comparison instruction that judges "+0" and "-0" to be unequal, we would hardly be inconvenienced. |
2021-12-21 01:07:51 |
js |
New posts tagged JavaScript - Qiita |
I tried building a ホロジュール web app with React and ASP.NET Core (C#) |
https://qiita.com/kerobot/items/fda428b9b8e13ef4e8a9
|
Referring to the react-lifecycle-methods-diagram: in the handler that calls the Web API, I was updating state with setState so the fetched result could be used to render the component, but as a result it fell into an infinite loop as shown below and a stack overflow occurred. |
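The failure mode described above is language-agnostic: if rendering triggers a state update and every state update triggers a re-render, the calls recurse until the stack overflows. This is an assumed analogy sketched in Python, not the author's React code.

```python
class Component:
    """Toy model of a component whose state changes schedule a re-render."""

    def __init__(self):
        self.state = 0

    def set_state(self, value):
        self.state = value
        self.render()          # a state change schedules a re-render

    def render(self):
        # BUG (the one in the excerpt): updating state unconditionally during
        # render -> render -> set_state -> render -> ... until the stack blows.
        self.set_state(self.state + 1)

try:
    Component().render()
except RecursionError:
    print("stack overflow: don't set state unconditionally during render")
```

In React the fix is the same in spirit: move the fetch-then-setState into a lifecycle hook (componentDidMount/useEffect) or guard it so it does not fire on every render.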
2021-12-21 01:02:02 |
Program |
New questions in [all tags] | teratail |
C#: an exception occurs in a program that uses ClosedXML to generate an Excel file with pasted images |
https://teratail.com/questions/374801?rss=all
|
Premise / what I want to achieve: when using ClosedXML in Visual Studio to generate an "Excel file with pasted images", an exception occurs. |
2021-12-21 01:45:37 |
Program |
New questions in [all tags] | teratail |
I want to switch between questions by pressing a button |
https://teratail.com/questions/374800?rss=all
|
I am building English quiz questions in JavaScript as in the code below. At the moment the questions are displayed on the page, but as in the link below I would like pressing a "Next" button to switch to the next question one at a time. |
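The logic the asker wants is an index that advances one question per click. As a framework-neutral sketch (assumed, not the asker's code), here it is in Python; in the browser, `next()` would be the "Next" button's click handler and `current()` would be rendered into the page.

```python
class Quiz:
    """Hold a list of questions and show one at a time."""

    def __init__(self, items):
        self.items = items
        self.index = 0

    def current(self):
        return self.items[self.index]

    def next(self):
        # What the "Next" button's handler would do: advance and wrap around.
        self.index = (self.index + 1) % len(self.items)
        return self.current()

quiz = Quiz(["What is 'apple'?", "What is 'dog'?", "What is 'book'?"])
print(quiz.current())  # the first question, shown initially
print(quiz.next())     # pressing Next swaps in the second question
```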
2021-12-21 01:28:50 |
Docker |
New posts tagged docker - Qiita |
I tried introducing CircleCI × docker-compose into a Rails app |
https://qiita.com/i-f/items/0859ac0a1cbc55321d8b
|
I introduced CircleCI and docker-compose into a Rails app. |
2021-12-21 01:04:15 |
Linux |
New posts tagged CentOS - Qiita |
The certificate could not be renewed with Cloudflare & Let's Encrypt |
https://qiita.com/aokabin/items/5b0a3990177c0bb643d7
|
Starting new HTTPS connection: api.cloudflare.com … Failed to renew certificate foobar with error: Server Error. Apparently the request to foobar was timing out (see "Troubleshooting Cloudflare XX errors", Cloudflare Help Center). While investigating, I had seen an article saying that when issuing a certificate, certbot accesses the service URL over http rather than https, so if you have an http-to-https redirect configured, the renewal fails, and I guessed that was the cause. Having guessed that, though, I had not configured any redirect in the web server's (Apache) conf, so I figured it must be a Cloudflare-side setting. Logging into the console, I found the following among the SSL settings: "Full" was originally selected, which was sending certificate requests to the origin server, so this time I changed it to "Flexible". |
2021-12-21 01:31:17 |
Ruby |
New posts tagged Rails - Qiita |
I tried introducing CircleCI × docker-compose into a Rails app |
https://qiita.com/i-f/items/0859ac0a1cbc55321d8b
|
I introduced CircleCI and docker-compose into a Rails app. |
2021-12-21 01:04:15 |
海外TECH |
Ars Technica |
Holmes’ fate hangs in the balance as jury deliberates criminal fraud charges |
https://arstechnica.com/?p=1821867
|
arguments |
2021-12-20 16:09:45 |
海外TECH |
MakeUseOf |
The 10 Best Raspberry Pi IoT Projects |
https://www.makeuseof.com/best-raspberry-pi-iot-projects/
|
internet |
2021-12-20 16:30:25 |
海外TECH |
DEV Community |
Implementing Firebase Storage to store files in your web app. |
https://dev.to/durgeshkr/implementing-firebase-storage-to-store-files-in-your-web-app-2hkk
|
Hello y'all, this is going to be my first blog, so let's get started. I'll be implementing Firebase Storage in a React web app, but the concept is the same in any technology, including vanilla JavaScript, Vue.js, AngularJS, our good old jQuery, and others.

Intro to Firebase Storage. According to the official Firebase docs: "Cloud Storage for Firebase lets you upload and share user generated content, such as images and video, which allows you to build rich media content into your apps. Your data is stored in a Google Cloud Storage bucket, an exabyte scale object storage solution with high availability and global redundancy." In simple words: if you want to incorporate file or image upload in your web app (a MERN app in our example), one method is to upload the file or image to Firebase's cloud storage, then collect the URL of the uploaded file and put that URL in your database.

Final code. Here is the completed upload logic that we will have at the end of this blog:

    import { uploadBytesResumable, getDownloadURL, ref, deleteObject } from "firebase/storage";
    import { storage } from "./Firebase";

    function uploadTaskPromise(file) {
      return new Promise(function (resolve, reject) {
        if (!file) return;
        const storageRef = ref(storage, `img/${file.name}`);
        const uploadTask = uploadBytesResumable(storageRef, file);
        uploadTask.on(
          "state_changed",
          (snapshot) => {
            const prog = Math.round((snapshot.bytesTransferred / snapshot.totalBytes) * 100);
            console.log(prog);
          },
          (error) => {
            console.log("ERRRRR");
            alert("Error inside upload file function");
            reject(error);
          },
          () => {
            getDownloadURL(uploadTask.snapshot.ref).then((downloadURL) => {
              console.log("File available at", downloadURL);
              resolve(downloadURL);
            });
          }
        );
      });
    }

And the logic for the POST request to the backend:

    const postSubmitHandler = async (e) => {
      e.preventDefault();
      try {
        const storageURL = await uploadTaskPromise(file);
        const response = await fetch("http://localhost:…/api/posts", {
          method: "POST",
          headers: { "Content-Type": "application/json", Authorization: "Bearer " + token },
          body: JSON.stringify({ location, description, price, contact, breed, creatorId: currentUser.id, image: storageURL }),
        });
      } catch (err) {
        const deleteImgRef = ref(storage, `img/${file.name}`);
        deleteObject(deleteImgRef)
          .then(() => console.log("Something went wrong. Image deleted successfully"))
          .catch((error) => console.log("an error occured while deleting image", error));
      }
    };

Creating the Firebase config file. If you're already using npm and a module bundler such as webpack or Rollup, run `npm install firebase` to install the latest SDK. Then initialize Firebase and begin using the SDKs for the web app; in our case we will only be using the storage SDK. Before we move on, I am imagining that you have already created your project in Firebase. To initialize Firebase, create a new file named firebase.js and paste in the following:

    import { initializeApp } from "firebase/app";
    import { getStorage } from "firebase/storage";

    const firebaseConfig = { /* PASTE YOUR FIREBASE CONFIG HERE */ };
    const app = initializeApp(firebaseConfig);
    const storage = getStorage(app);

    export { storage, app };

Now go to Project settings (the cog-wheel icon in the left menu of your Firebase project), scroll down, and copy whatever you have in your firebaseConfig constant into the firebaseConfig constant we just created in firebase.js. We can now use the Firebase Storage SDK anywhere in the app.

Creating the upload file function. In the file where we create the post, we import the required dependencies and define uploadTaskPromise, which accepts a file as a parameter. In order to upload or download files, delete files, or get or update metadata, you must create a reference to the file you want to operate on. A reference can be thought of as a pointer to a file in the cloud; references are lightweight, so you can create as many as you need, and they are reusable for multiple operations. Once the reference is created, we create an uploadTask by passing our reference and the file we want to upload to uploadBytesResumable.

Uploading the file. The uploadTask has an `on` method which registers three observers: a "state_changed" observer called any time the state changes, an error observer called on failure, and a completion observer called on successful completion. The first observer watches state-change events such as progress, pause, and resume; we also get task progress, including the number of bytes uploaded and the total number of bytes to be uploaded, from which we can compute the percentage uploaded in real time. The second observer handles any unsuccessful uploads; in this block we can handle any error we might get while uploading. The third observer, the most important, handles successful uploads: on completion we extract the URL of the uploaded file, so that we can store that URL in our database rather than storing the entire file. Uploading to cloud storage is an asynchronous task, so we wrap the whole thing in a promise: it rejects if we reach the error observer, and resolves with the uploaded file's URL on successful completion.

Upload post logic. With the upload logic ready, we move on to storing the URL in our database by sending a POST request to our backend. We create a function named postSubmitHandler, fired whenever we click the Post button, and declare it async because storing data into the database and uploading the file to storage are both asynchronous tasks. First, and most important: call e.preventDefault(), or else the page will refresh whenever we click Post and our code WILL BLOW UP. Next we wrap the entire logic in a try/catch block for that extra layer of precaution. In the try block we call uploadTaskPromise and pass it the file to upload (don't forget the await keyword). If the upload succeeds, the promise resolves with our stored file's URL, which we keep in the constant storageURL; if it fails, the promise rejects and we are thrown into the catch block, where we can handle the error. With the URL in hand, all we need to do is send the POST request to our backend.

The final consideration: what happens if we successfully store the file in cloud storage but fail to store the data in our database? We will still be tossed into the catch block, but the uploaded file would remain in our cloud storage doing nothing except wasting space. To handle this, we remove the file we saved by generating a reference to it, similar to the reference we generated while uploading, and passing that reference to deleteObject, imported from firebase/storage. This deletes the stored file whenever something goes wrong while saving to the database.

Thank you for your time! Save the post so you may refer to it if you wish to use Firebase Storage in the future. |
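The post's key pattern is wrapping a callback-based upload API in a promise so it can be awaited. The same shape translates to Python's asyncio, sketched below as an illustration: the `fake_upload` API and its URL are invented stand-ins, not Firebase's SDK.

```python
import asyncio

def fake_upload(data, on_progress, on_error, on_complete):
    """Stand-in for a callback-style SDK, like uploadTask.on("state_changed", ...)."""
    for pct in (50, 100):
        on_progress(pct)                       # progress observer
    on_complete("https://example.com/files/123")  # completion observer

async def upload_task_promise(data):
    loop = asyncio.get_running_loop()
    future = loop.create_future()              # plays the role of new Promise(...)
    fake_upload(
        data,
        on_progress=lambda pct: None,          # ignore progress here
        on_error=future.set_exception,         # reject(error)
        on_complete=future.set_result,         # resolve(downloadURL)
    )
    return await future

url = asyncio.run(upload_task_promise(b"bytes"))
print(url)
```

The caller can now simply `await upload_task_promise(...)` inside a try/except, exactly as the post does with `await uploadTaskPromise(file)` inside try/catch.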
2021-12-20 16:56:32 |
海外TECH |
DEV Community |
Automate Azure Resource Decommissions (with tracking) |
https://dev.to/pwd9000/automate-azure-resource-decommissions-with-tracking-aok
|
Overview. Today we look at a common use case around resource management in Azure: how to manage resource decommissions more effectively, even letting users self-serve a decommission simply by applying an Azure tag, while tracking successful and failed decommissions in a tracker table (Azure Table Storage). We automate the process with a serverless Azure Function App, with PowerShell as the code base, set on a daily run trigger. We also use the Function App's own storage account to create two tables: one called "Tracker", to track successful decommissions by resource ID and date of decommission, and one called "Failed", in which we track failed decommissions, for example when a resource had a resource lock on it or some other failure prevented our automation from completing the task.

In this demo I use a resource tag with key "Decommission" and a value in the date format dd/MM/yyyy. The idea is simple: place the Decommission tag on the resource OR resource group you would like to decommission, with the date on which the decommission should take place. The function app runs on a daily cron schedule, searches for resources and resource groups tagged with the Decommission key, evaluates against the current date whether the decommission should be initiated, and records each event (resource ID and date of the successful or failed decommission) in a storage table so that automated events can be tracked and audited.

Pre-requisites. To set up everything the function app needs, I wrote a PowerShell script using the AZ CLI that builds and configures all the resources; there is one manual step, which I cover a bit later. You can find the script on my GitHub code page as "Azure-Pre-Reqs". First log into Azure with `az login` and select the subscription, then run the script. Step by step, it:

1. Creates a resource group called "Automated-Resource-Decommissioning".
2. Creates an Azure storage account for the function app.
3. Creates a PowerShell Function App with a system-assigned managed identity, a consumption App Service plan, and Application Insights.
4. Configures Function App environment variables (consumed inside the function later): the scopes (an array of subscription IDs to search), the resource group name, the storage account name, the tracker table name, and the table partition.
5. Creates the "Tracker" and "Failed" tables in the function app's storage account (after granting the signed-in user the "Reader and Data Access" and "Storage Table Data Contributor" roles so the tables can be created).
6. Assigns the function's system-assigned managed identity "Reader and Data Access" on the storage account, "Storage Table Data Contributor" on the tables, and Contributor on the subscription so it can perform decommissions.

NOTE: in the script, the scopes variable determines which subscriptions the decommission function will search for resources to decommission; it can hold one or more subscription IDs.

The manual step I mentioned: edit the function's requirements.psd1 file to allow the Az module inside the function by uncommenting it, and add a module called AzTable. This file enables modules to be automatically managed by the Functions service. Remember to save the manual change to requirements.psd1. Our environment is now set up, and in the next section we configure the function to run automated decommissions on a timer.

Decommission function. The function app code can be found on my GitHub code page as "run.ps1". Navigate to the function app we created, select Create under Functions, choose "Develop in portal" with the "Timer trigger" template, name the function "ResourceDecommission", set the cron schedule to the frequency you need (in my case once a day), and hit Create. (You can change the cron timer trigger at any time under the function's Integration section.) Then go to Code + Test, replace all the code under run.ps1 with the code from the repo, and save.

Let's take a closer look at what the code actually does. In the first few lines, the function app takes an input parameter called Timer, linked to the cron trigger we set when creating the function. Next it loads two PowerShell functions: one that evaluates and returns resources to be decommissioned and another that returns resource groups to be decommissioned (you can look at each individually on my GitHub code page: Get-ResourceDecom and Get-ResourceGroupDecom). Each sets the subscription context, formats today's date as dd/MM/yyyy, reads the object's tags, and returns the object when its Decommission tag value is today's date or earlier (a -Future switch inverts this to find upcoming decommissions). Then comes the main section that processes decommissions: first it sets some variables from the function app's Application Settings that our pre-req script created.

NOTE: Scopes can be one or more comma-separated subscription IDs that the function app will search for resources or resource groups to decommission. If you define more than one subscription, ensure the function app's managed identity has the relevant IAM/RBAC access over any additional subscriptions you want covered.

The remaining code consumes the two loaded functions to collect the resource IDs of the relevant resources and resource groups, removes them with Remove-AzResource, and records each resource ID in the "Tracker" table on successful decommission, or in the "Failed" table if the decommission failed. Resource groups are decommissioned first, followed by resources.

Testing the function app. Let's test it and see if it does what it says on the tin. In my environment I set up several resource groups (TestRG…) and storage accounts (pwdsa…), each tagged with the Decommission key and a date, and placed a delete resource lock on some of them so the decommission would fail, to see whether those resources get recorded in my "Failed" table. Based on my test date, I expected the unlocked resources whose tag date had arrived to be removed and recorded in my Tracker table when the function triggered. That is in fact what happened: those resources were no longer in my Azure subscription and were recorded, while the resource group and storage account with resource locks enabled failed to decommission and were recorded in my Failed table. Everything is working as expected.

Now we can easily pre-set resources for decommission by simply adding a tag with key "Decommission" and a value of the date we want the resource decommissioned, in the format dd/MM/yyyy. We could also give users access to set tags themselves on resources they manage, allowing them to remove resources in a controlled manner. The function app just runs on its schedule, decommissions tagged resources, and tracks the automated decommissions and failures in table storage.

I hope you have enjoyed this post and have learned something new. You can find the code samples used in this blog post on my GitHub page. Author: Marcel L, Cloud Solutions & DevOps Architect. |
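The evaluation rule at the heart of the post, "decommission when the tag date is today or earlier", is easy to state in a few lines. This is an assumed reimplementation in Python for illustration, not the author's PowerShell.

```python
from datetime import date, datetime

def due_for_decommission(tags: dict, today: date) -> bool:
    """Return True when the resource carries a Decommission tag whose
    dd/MM/yyyy date is today or in the past."""
    value = tags.get("Decommission")
    if value is None:
        return False                      # untagged resources are never touched
    tag_date = datetime.strptime(value, "%d/%m/%Y").date()
    return tag_date <= today

today = date(2021, 12, 21)
print(due_for_decommission({"Decommission": "20/12/2021"}, today))  # True
print(due_for_decommission({"Decommission": "25/12/2021"}, today))  # False
print(due_for_decommission({"env": "prod"}, today))                 # False
```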
2021-12-20 16:50:33 |
海外TECH |
DEV Community |
React Native Lessons & Best Practices after 2 years of development |
https://dev.to/stevepepple/react-native-lessons-best-practices-after-2-years-of-development-1ag
|
React Native Lessons amp Best Practices after years of developmentMy team has been using React Native for about years now I d like to share some lessons resources and best practices regarding React Native development We are on the lookout for any more tips and guides to performance UX and other good practices so please share other suggestions in the comments Our company Vibemap has a travel amp lifestyle app for iOS amp Android that is build with React Native and Mapbox We knew we needed our app on Android from day one and also need to provide a consumer level experience React Native has made it possible to deliver an app on both platforms but it s also present us with a number of challenges One of the amazing benefits of React Native is being able to have a common code base that can be released to iOS and Android The abstractions are not always perfect and we have found that in some scenarios who still need to write native platform specific modules for iOS and Android and that means you might need to write some Swift Objective C or Java As our project grew in complexity the build process also wasn t seamless especially the Gradle steps on Android It s not just code sharing between iOS and Android that has helped us since React Native is Javascript and React we ve been able to share most of our business logic and helper functions between our website and mobile app In theory you can even compile React Native components for web but that introduced some overhead in terms of webpack bundle size and complexity Webpack is a whole other story We also have a design system of colors typography layout rules and component styles Use design tokens and styled components we ve been able to reuse most but not all of our styles between web and mobile Here are the top ten lessons and best practices that have helped our team in our journey with React Native and a few of the articles that have helped us Optimize React Native Performance in How to improve the performance of a React 
Native app best practices that will increase React Native performance

Keep components small and avoid excessive renders. Keeping your project in a consistent structure of screens, higher-order components, and UI components helps keep you organized and productive, and also makes it easier to employ code splitting, lazy loading, and other performance techniques. The React.memo and useMemo APIs are two distinct tools that help prevent re-rendering.

Enable Hermes and keep React Native up to date. Hermes has given our app dramatic performance improvements on both iOS and Android. "Hermes helps reduce the download size of the APK, the memory footprint and consumption, and the time needed for the app to become interactive (TTI, Time to Interact)." (Codemagic) Getting the packaging of JavaScript into bytecode on Android was a little mystifying, but we finally got it all working.

Use a UI library with discretion. There are many good out-of-the-box components in React Native. That said, we've found some excellent UI libraries that provide a UX rivaling the native experience. We've tried to keep the experience consistent with iOS and Android guidelines. The React Native Paper library provides us with a nice selection of basic components as well as Material Design components.

Use a design system. Related to the UI library suggestion above, using a design system of reusable components has allowed us to keep the app experience consistent and invest our effort into making cards, lists, buttons, and other elements fast and responsive.

Use FlatList. Optimize lists and make sure they have a key attribute, which reduces re-renders when a component's data doesn't change. For long lists, use FlatList instead of ScrollView: "it's advisable to choose FlatList over ScrollView to render all countable items, with attributes of lazy loading that helps in improving the app's performance." (Omji Mehrota)

Clean up console statements and environment variables. Any console.log statements will add overhead to the JavaScript thread. We also discovered from a friendly user's security audit that, by default, React Native was storing some of our environment and config variables in an insecure place in the app bundle.

Use Firebase Crashlytics & Performance Monitoring. Another tool that's been super helpful to our efforts is Firebase and its Crashlytics tool. iOS and Android will also report crashes for React Native apps, but it can be difficult to diagnose the root cause; we found the stack traces in Firebase to be more informative. Plus, you can report on all other app analytics and filter to the specific devices that are exhibiting issues.

Server response & payload size. One area where our team is still working diligently is speeding up the APIs and other data served to the mobile client. Our app loads data from a few different APIs, and we found that loading data for lists and maps was a major bottleneck. Similarly, images should be compressed and served in next-gen formats like WebP.

Enable RAM format. "Using the RAM format on iOS will create a single indexed file that React Native will load one module at a time." (Codemagic guide) Note that if you enable Hermes, this optimization is already implemented.

Reduce app bundle size. It probably goes without saying that you should remove any unused libraries and components. That said, as a project grows and changes, existing modules can be left behind in your package.json, so it's good practice to regularly update to newer versions and check their impact on your overall app size. We greatly reduced the size of our app by replacing moment.js with day.js and using native JavaScript methods in place of Lodash.

Thanks for reading! Please send other suggestions and I'll keep this article up to date: @stevepepple on Twitter and my personal blog. |
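The caching idea behind React.memo and useMemo (remember a result and skip recomputation when the input hasn't changed) can be sketched in plain JavaScript without React at all; the `memoize` helper and `slowSquare` below are illustrative names, not part of React's API:

```javascript
// Plain-JS sketch of the caching idea behind React.memo/useMemo:
// remember the result for a given input and skip the recomputation
// when the same input shows up again.
function memoize(fn) {
  const cache = new Map();
  return (arg) => {
    if (!cache.has(arg)) {
      cache.set(arg, fn(arg)); // compute once per distinct argument
    }
    return cache.get(arg);
  };
}

let calls = 0; // counts how often the underlying work actually runs
const slowSquare = (n) => {
  calls += 1;
  return n * n;
};

const fastSquare = memoize(slowSquare);
fastSquare(4); // computes: calls is now 1
fastSquare(4); // served from cache: calls is still 1
```

React applies the same trade-off: memoization costs memory for the cache, so it pays off only when the wrapped computation (or render) is genuinely expensive.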
2021-12-20 16:14:23 |
Overseas TECH |
DEV Community |
How to protect your JavaScript Code |
https://dev.to/snowbit/how-to-protect-your-javascript-code-2c3b
|
How to protect your JavaScript Code

Hello, I am back with another article, and it will be fun. Let's get started!

What does "obfuscate" mean? In JavaScript, obfuscation means protecting code that is exposed on the web.

Some benefits of obfuscation:
- Helps protect secret credentials from attackers
- Helps hide some functions
- Prevents people from copying or modifying your code without authorization

The obfuscated JavaScript will be far larger and much more difficult to understand.

Code, not obfuscated:

function hello() {
  console.log("Hello world");
}
hello();

Obfuscated code: (a long, unreadable block of hex-named variables, encoded string lookups, and a self-defending while/try loop). Test them both; they will have the same output.

How to obfuscate:
1. Write your JavaScript code
2. Copy and paste it into Obfuscator.io
3. Replace the original code with the obfuscated output

Make sure to check out my YouTube channel and subscribe to it! |
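The before/after idea can be illustrated with a tiny hand-rolled sketch. A real obfuscator such as Obfuscator.io goes much further (string-array rotation, control-flow flattening); the `_0xdata` and `_0xfn` names below are made up for illustration:

```javascript
// Readable original.
function hello() {
  return "Hello world";
}

// A crude hand-made "obfuscation" of the same logic: the string is
// split into hex-escaped fragments behind throwaway names, so the
// literal "Hello world" never appears in the source text.
const _0xdata = ["\x48\x65\x6c\x6c\x6f", "\x77\x6f\x72\x6c\x64"];
function _0xfn() {
  return _0xdata.join("\x20");
}

console.log(hello() === _0xfn()); // prints true: both build "Hello world"
```

Note that obfuscation only raises the effort needed to read the code; it is not encryption, and anything the browser can run, a determined reader can eventually deobfuscate.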
2021-12-20 16:02:35 |
Overseas TECH |
Engadget |
Beats Studio Buds are down to $100 at Adorama |
https://www.engadget.com/beats-studio-buds-good-deal-adorama-162941480.html?src=rss
|
Beats Studio Buds are down to $100 at Adorama

If you're scrambling for a last-minute holiday gift, it might be worth checking out a solid deal on Beats Studio Buds. The earbuds have dropped to $100, which matches the lowest price we've seen for them to date.

Buy Beats Studio Buds at Adorama.

We included Beats Studio Buds on our list of the best wireless earbuds on the market after reviewing them. They offer good sound quality with the kind of bass levels that Beats gear is known for. The earbuds, which have IPX sweat and water resistance, include active noise cancellation and an option that automatically adjusts the volume depending on environmental audio levels. Studio Buds have Apple's H chip (Beats is owned by Apple, after all), and they support fast pairing with both Android and iOS devices. You can use them for AirPods-style seamless switching between Apple products, and they support the company's Spatial Audio feature. In addition, Studio Buds work with Apple's Find My and Android's Find My Device. There's also hands-free Siri support.

On the downside, we felt that call quality wasn't great, and the lack of wireless charging support might be disappointing for some. There are no on-board volume controls or any options for customizing the sound, either. Still, these are a worthy option for those looking for a set of true wireless earbuds. |
2021-12-20 16:29:41 |
Cisco |
Cisco Blog |
Disabilities Around Us – Turning Awareness into Action |
https://blogs.cisco.com/diversity/disabilities-around-us-turning-awareness-into-action
|
Disabilities Around Us – Turning Awareness into Action

At Cisco, our purpose is to power an inclusive future for all. I truly believe those words, "inclusive" and "for all", must underpin everything we do at Cisco. Many Cisco employees participate in Cisco's inclusive communities, a group of employee resource organizations (EROs) and employee networks that help our people connect. |
2021-12-20 16:40:00 |
Cisco |
Cisco Blog |
3 Technology Trends in the Future of Government |
https://blogs.cisco.com/government/3-technology-trends-in-the-future-of-government
|
3 Technology Trends in the Future of Government

Welcome to the first blog in our SecureGovernment series. Today we look at the technological trends that are likely to drive the future of government. In recent times we've seen an unprecedented rate of digital transformation in the public sector. While government is traditionally built for stability and risk minimization, agility and rapid adoption were… |
2021-12-20 16:35:10 |
Overseas science |
NYT > Science |
Parents Grapple With How Long to Wait for Their Children’s Second Shots |
https://www.nytimes.com/2021/12/20/health/kids-covid-vaccine-second-dose.html
|
Parents Grapple With How Long to Wait for Their Children's Second Shots

Waiting eight weeks or more between doses may boost immunity. But as Omicron slams the United States, waiting also comes with risks. |
2021-12-20 16:56:27 |
Overseas science |
NYT > Science |
E.P.A. Announces Tighter Auto Pollution Rules |
https://www.nytimes.com/2021/12/20/climate/tailpipe-rules-climate-biden.html
|
average |
2021-12-20 16:02:52 |
Finance |
Financial Services Agency website |
Published a summary of Finance Minister (and Minister of State for Special Missions) Suzuki's post-cabinet-meeting press conference of December 10, 2021. |
https://www.fsa.go.jp/common/conference/minister/2021b/20211210-1.html
|
Minister of State for Special Missions (Cabinet Office) |
2021-12-20 17:30:00 |
Finance |
Financial Services Agency website |
Posted the Basel Committee on Banking Supervision's "Principles for the effective management and supervision of climate-related financial risks". |
https://www.fsa.go.jp/inter/bis/20211118/20211118.html
|
related |
2021-12-20 17:00:00 |
News |
JETRO Business News (Tsusho Koho) |
November inflation slowed to 2.5%, but is expected to accelerate again from December onward |
https://www.jetro.go.jp/biznews/2021/12/d8a5819eb2ec793c.html
|
slowdown |
2021-12-20 16:40:00 |
News |
JETRO Business News (Tsusho Koho) |
Continuing component shortages affect many domestic manufacturers |
https://www.jetro.go.jp/biznews/2021/12/b69e03975ce21810.html
|
components |
2021-12-20 16:30:00 |
News |
JETRO Business News (Tsusho Koho) |
Sake PR and business-matching events held back-to-back in New Delhi |
https://www.jetro.go.jp/biznews/2021/12/6f0f5d5bdb9dbcf2.html
|
sake |
2021-12-20 16:20:00 |
News |
JETRO Business News (Tsusho Koho) |
Container shipping and freight rail routes linking Dalian Port with overseas destinations open one after another |
https://www.jetro.go.jp/biznews/2021/12/0464984cabf7487d.html
|
freight rail |
2021-12-20 16:10:00 |
News |
BBC News - Home |
Covid: No guarantees over Christmas lockdown, says Dominic Raab |
https://www.bbc.co.uk/news/uk-59725266?at_medium=RSS&at_campaign=KARANGA
|
covid |
2021-12-20 16:51:02 |
News |
BBC News - Home |
Premier League and EFL clubs to fulfil fixture list despite Covid-19 disruption |
https://www.bbc.co.uk/sport/football/59732905?at_medium=RSS&at_campaign=KARANGA
|
Premier League and EFL clubs to fulfil fixture list despite Covid-19 disruption

Premier League and EFL clubs decide to keep all fixtures in place over the festive period despite ongoing Covid-19 disruption. |
2021-12-20 16:52:45 |
News |
BBC News - Home |
Sir Richard Sutton: Partner's son gets life sentence for murder |
https://www.bbc.co.uk/news/uk-england-dorset-59725874?at_medium=RSS&at_campaign=KARANGA
|
attack |
2021-12-20 16:42:48 |
News |
BBC News - Home |
Prosecutors call Ghislaine Maxwell 'sophisticated predator' in closing arguments |
https://www.bbc.co.uk/news/world-us-canada-59730923?at_medium=RSS&at_campaign=KARANGA
|
children |
2021-12-20 16:32:16 |
News |
BBC News - Home |
Sale Sharks sign England and Exeter lock Hill for next season |
https://www.bbc.co.uk/sport/rugby-union/59731026?at_medium=RSS&at_campaign=KARANGA
|
chiefs |
2021-12-20 16:02:16 |
Hokkaido |
Hokkaido Shimbun |
Mitsubishi Electric: defects in 1,240 generators, known since 2004 |
https://www.hokkaido-np.co.jp/article/625439/
|
Mitsubishi Electric |
2021-12-21 01:09:00 |