Posted: 2022-03-12 00:34:39  RSS feed roundup for 2022-03-12 00:00 (38 items)

Category  Site  Article title / trend words  Link URL  Frequent words / summary / search volume  Date registered
AWS AWS Startups Blog Hacking for Social Good: How AWS Hypercharged Our Hackathon https://aws.amazon.com/blogs/startups/hacking-for-social-good-how-aws-hypercharged-our-hackathon/ Nonprofits are often overwhelmed by the amount of data they accumulate and lack the resources to generate value from it. In response to this challenge, the charitable organization Data Science for Social Good (DSSG) Berlin was founded. As part of its mission to enlist volunteer data scientists and analysts to help nonprofits use their data properly, DSSG Berlin hosted Datathon, a data science hackathon powered by AWS services. 2022-03-11 14:43:10
python New posts tagged Python - Qiita A few things that caught my attention with NFC https://qiita.com/pentyan0303/items/90fd17741a120e6f256f Cards in order of how easily they were read, from the right: Aime → BANAPASSPORT → ICOCA. Discussion: my knowledge of NFC is still limited, so little can be concluded from this result, but I suspect that differences in card type and in the amount of data embedded in each card affect how easily they are read. 2022-03-11 23:57:21
python New posts tagged Python - Qiita How to use the N-gram generation tool SRILM with Python on Windows https://qiita.com/phi934/items/440337e6d18f465f688f Find the line that sets the SRILM path (SRILM = /home/speech/stolcke/project/srilm/devel in the stock Makefile), rewrite it as described, and save. 2022-03-11 23:52:51
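Once the SRILM tools are built, they can be driven from Python with subprocess. A minimal sketch, assuming a Windows build of ngram-count and placeholder file names (none of this comes from the linked post):
    import subprocess

    # Hypothetical location of the built SRILM binary; adjust to your build output.
    NGRAM_COUNT = r"C:\srilm\bin\ngram-count.exe"

    # Train a 3-gram language model from a tokenized corpus and write it to model.lm.
    subprocess.run(
        [NGRAM_COUNT, "-text", "corpus.txt", "-order", "3", "-lm", "model.lm"],
        check=True,
    )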
python New posts tagged Python - Qiita Impact analysis of network failures with Batfish (2) https://qiita.com/kitara/items/2ea1ffb281b589b15b46 Snippet: headers=HeaderConstraints(dstIps=..., srcIps=...). Closing note: this time the impact analysis was run on a small network and covered only a limited set of traffic-path patterns. 2022-03-11 23:42:27
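For context, a rough pybatfish sketch of the kind of query the snippet hints at; the session setup, snapshot path, device name and IP addresses are all assumptions, not taken from the post:
    from pybatfish.client.session import Session
    from pybatfish.datamodel.flow import HeaderConstraints

    # Point at a running Batfish service and load a snapshot of device configs.
    bf = Session(host="localhost")
    bf.init_snapshot("./snapshot", name="base", overwrite=True)

    # Trace where traffic matching the given source/destination would go.
    answer = bf.q.traceroute(
        startLocation="router1",
        headers=HeaderConstraints(srcIps="10.0.0.1", dstIps="10.0.1.1"),
    ).answer()
    print(answer.frame())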
python New posts tagged Python - Qiita Tripped up by the virtual environment not activating automatically in PyCharm's Terminal https://qiita.com/__Lily__/items/71c75566f067c19c18aa main.py: import sys; print(sys.executable). This is a script that prints the path of the python.exe currently being run. 2022-03-11 23:36:29
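A runnable form of the check described above (the file name matches the post; the example path in the comment is just an illustration):
    # main.py: print which interpreter is actually running, a quick way to confirm
    # whether PyCharm's Terminal really activated the project's virtualenv.
    import sys

    print(sys.executable)  # e.g. ...\venv\Scripts\python.exe when the venv is active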
python New posts tagged Python - Qiita Running Selenium as an EXE https://qiita.com/middle_aged_rookie_programmer/items/dd82febac13072ec148e In my case, I copied the folder inside C:\Program Files (x86)\Google\Chrome\Application together with chrome.exe. 2022-03-11 23:32:46
python New posts tagged Python - Qiita Notes on YOLOv5 annotation txt files https://qiita.com/4vent/items/30b6d33c85b33ce48179 2022-03-11 23:09:05
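For reference, the YOLO txt label format the note concerns is one line per object: a class id followed by a box normalized to the image size. A small sketch (the numbers are made up, not from the post):
    # Build one YOLOv5-style label line: "class x_center y_center width height",
    # with all coordinates normalized to [0, 1] by the image dimensions.
    def to_yolo_line(cls_id, x_min, y_min, x_max, y_max, img_w, img_h):
        x_c = (x_min + x_max) / 2 / img_w
        y_c = (y_min + y_max) / 2 / img_h
        w = (x_max - x_min) / img_w
        h = (y_max - y_min) / img_h
        return f"{cls_id} {x_c:.6f} {y_c:.6f} {w:.6f} {h:.6f}"

    # One .txt file per image, same basename as the image, one line per object.
    print(to_yolo_line(0, 100, 200, 300, 400, img_w=640, img_h=480))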
Ruby New posts tagged Ruby - Qiita [For absolute beginners] How to delete rails migration files https://qiita.com/wak10/items/f3531d0fa2b51697e982 Set the migration's status to down as explained below, then get rid of the unneeded migration. Steps for deleting a migration file: set it to down with rails db:migrate:down VERSION=<migration ID>, then delete the file with rm db/migrate/<migration ID>_hoge.rb. Explanation: take the migration ID shown by rails db:migrate:status and plug it into rails db:migrate:down VERSION=<migration ID>; running it prints something like "Hoge: reverting ... Hoge: reverted", and rails db:migrate:status will now show the migration as down. Once it is down, the file can be deleted in one shot with rm db/migrate/<migration ID>_hoge.rb (in the command it is fine for the model name's first letter to be lower-case). Steps for removing a migration whose status shows NO FILE: recreate it under the same name with touch db/migrate/<migration ID>_hoge.rb, write its contents, apply them with rails db:migrate, set it to down with rails db:migrate:down VERSION=<migration ID>, then delete it with rm db/migrate/<migration ID>_hoge.rb. If the model file was deleted first, rails db:migrate:status shows NO FILE. 2022-03-11 23:50:31
Ruby New posts tagged Ruby - Qiita Today's basic Ruby practice problem (22/3/11) https://qiita.com/t-tokio/items/7879838ff731b5cbfd48 First, we write a program that prints the sequence of numbers up to the target value to the terminal. 2022-03-11 23:10:34
AWS New posts tagged AWS - Qiita Environment setup with aws-cdk #1 https://qiita.com/KbSota/items/48d9d8ffaeccd775714e Once installation finishes, files like the following are created under the directory where the command was run. 2022-03-11 23:47:47
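For orientation, this is roughly the shape of the app that cdk init scaffolds. A minimal AWS CDK v2 sketch in Python with an arbitrary stack and one example resource (the names are not from the post):
    # app.py: minimal AWS CDK v2 application; run `cdk synth` / `cdk deploy`
    # from the project directory after bootstrapping the account.
    import aws_cdk as cdk
    from aws_cdk import aws_s3 as s3
    from constructs import Construct

    class SampleStack(cdk.Stack):
        def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
            super().__init__(scope, construct_id, **kwargs)
            s3.Bucket(self, "SampleBucket")  # one example resource

    app = cdk.App()
    SampleStack(app, "SampleStack")
    app.synth()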
Azure New posts tagged Azure - Qiita Tried running a sample web application that supports OIDC using the OWIN library https://qiita.com/hiroakimurata/items/0e47dfd19b584d05305c Verified user sign-on behavior with Azure AD: to authenticate the application's users, register the app in Azure AD, obtain the required information, and configure authentication. 2022-03-11 23:13:41
Tech blog Developers.IO Quickly creating an authentication feature with an Amazon Cognito User Pool using the AWS CLI https://dev.classmethod.jp/articles/quickly-create-an-authentication-function-with-amazon-cognito-userpool-with-the-aws-cli/ amazon cognito 2022-03-11 14:53:16
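The linked article drives Cognito through the AWS CLI; for comparison, a boto3 sketch of the same first steps (region, pool and client names are placeholders, not taken from the article):
    import boto3

    # Create a user pool and an app client without a secret, roughly the first
    # steps a CLI-based walkthrough would perform with `aws cognito-idp ...`.
    idp = boto3.client("cognito-idp", region_name="ap-northeast-1")

    pool = idp.create_user_pool(PoolName="sample-user-pool")
    pool_id = pool["UserPool"]["Id"]

    app_client = idp.create_user_pool_client(
        UserPoolId=pool_id,
        ClientName="sample-app-client",
        GenerateSecret=False,
    )
    print(pool_id, app_client["UserPoolClient"]["ClientId"])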
Overseas TECH Ars Technica EU and UK open antitrust probe into Google and Meta over online ads https://arstechnica.com/?p=1840350 online ads 2022-03-11 14:40:59
Overseas TECH MakeUseOf How to Get Started With Tasker, the Best Android Automation App https://www.makeuseof.com/tag/tasker-android-mobile-app-caters-whim/ Tasker is the best automation app for Android. In this beginner's guide we'll show you what you can use it for and how to get started. 2022-03-11 14:45:13
Overseas TECH MakeUseOf How Your Instagram Account Can Be Hacked and How To Stop It https://www.makeuseof.com/how-instagram-hacked-stop-it/ instagram 2022-03-11 14:45:13
Overseas TECH MakeUseOf How to Delete or Restore Files in Google Drive https://www.makeuseof.com/how-to-delete-restore-files-google-drive/ account 2022-03-11 14:30:13
Overseas TECH MakeUseOf final ZE3000 Review: Stylish Earbuds Deliver Excellent Sound https://www.makeuseof.com/final-ze3000-review/ design 2022-03-11 14:05:14
Overseas TECH DEV Community Basics of AWS Tags & Terraform with S3 - Part 1 https://dev.to/cloudforecast/basics-of-aws-tags-terraform-with-s3-part-1-577i Managing AWS resources can be arduous: AWS lacks the logical resource groups and similar niceties that Azure and GCP have, yet it remains far and away the most popular cloud provider, so tags are one of the most important ways to organize and filter resources, and Terraform offers several ways to apply them. The walkthrough (with an accompanying video series and GitHub repo) creates an aws_s3_bucket with a random_id suffix to avoid bucket-name clashes and a per-resource tags block (Env, Service, Team), and verifies the tags with terraform console or terraform state show. Because repeating tags on every resource violates the DRY principle, a second bucket is added and the shared Env tag is moved into a default_tags block inside the provider block so every resource inherits it as a fallback; the catch is that inherited tags appear in the tags_all attribute rather than tags, so anything querying state for tags should read tags_all. A final test shows tag precedence: defining Env = "prod" on the finance bucket overrides the provider-level default of "dev". The post ends with terraform destroy and a pointer to the next part in the series, which dives deeper. 2022-03-11 14:57:20
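The post checks the resulting tags from terraform console; as a complementary check from outside Terraform, a boto3 sketch (the bucket name is a placeholder, and the example output simply mirrors the post's Env/Service/Team keys):
    import boto3

    # Read back the tags Terraform applied to a bucket.
    s3 = boto3.client("s3")
    resp = s3.get_bucket_tagging(Bucket="devops-bucket-12345")
    print({t["Key"]: t["Value"] for t in resp["TagSet"]})
    # e.g. {'Env': 'dev', 'Service': 's3', 'Team': 'devops'}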
Overseas TECH DEV Community Contributing to the Apache Airflow project - part two https://dev.to/aws/contributing-to-the-apache-airflow-project-part-two-4odh The second and concluding post on contributing to Apache Airflow (part one covered the first steps and interactions with the community). The author first gets a test DAG that uses the Amazon provider's ECS operator running inside the Breeze development environment, copying it into the files/dags folder and creating an aws_default connection with credentials and region, and notes that airflow info and the logs under the Breeze environment help when debugging. A trivial change to the log message in the provider's ecs.py confirms that local edits are picked up by the dev setup. While re-testing, it turns out the originally reported issue, that an ECS task with launch_type EXTERNAL fails, was actually caused by a mis-configured awslogs group/stream prefix and IAM permissions in the ECS task, not by the operator; with logging configured correctly the EXTERNAL launch type works both in Breeze and on Managed Workflows for Apache Airflow (MWAA). The contribution therefore becomes documentation and tests instead: updating the launch_type docstring to list EC2, EXTERNAL and FARGATE, adding an EXTERNAL case to the parameterized tests in the ECS operator test module (run via breeze tests), and adding an example DAG alongside the existing Fargate example. The rest of the post covers running the pre-commit static checks, building and previewing the docs with breeze build-docs and the docs server (the docs are Sphinx rather than markdown, which took far longer than expected), rebasing the fork, and submitting the PR, where a maintainer caught the use of the deprecated ECSOperator name instead of EcsOperator. Lessons learned: the project is well organised with comprehensive contributor documentation, the community is welcoming and responsive, the journey from issue to pull request may not end where you expect but every contribution is valuable, Breeze and the cloud IDE workspace (kept hibernated for future contributions) did a lot of the heavy lifting, and running the static checks and docs build early saves both the contributor and the maintainers time. 2022-03-11 14:51:51
Overseas TECH DEV Community Creating a Task Results Report with Miço https://dev.to/liman/mico-ile-gorev-sonuclari-raporu-olusturmak-2ibh All results of tasks run on clients with Miço and viewable in the Liman MYS Miço plugin can be exported as a report: open the task-results screen you want to report on, click the "Export report" button at the top right, and when the report is ready click the download button. If the job takes a while, the export runs in the background, so the window can be closed and the report downloaded later from the Reports tab. 2022-03-11 14:51:03
Overseas TECH DEV Community Exporting Miço Tasks https://dev.to/liman/mico-gorevlerini-disa-aktarmak-520n Tasks already written or in use in Miço can be moved to another environment without extra effort: click the options at the top right of a task, choose "Export task" from the menu, and click the download button in the window that appears; the task is exported so it can later be imported elsewhere. 2022-03-11 14:50:44
Overseas TECH DEV Community Security news weekly round-up - 11th March 2022 https://dev.to/ziizium/security-news-weekly-round-up-11th-march-2022-3g62 This week's review is mostly about bugs and data breaches affecting big names in the tech industry: Nvidia, Samsung, Intel, AMD and Arm. The Lapsus$ group that breached Nvidia issued one of the most unusual demands ever, first asking for firmware updates that remove the LHR mining limiter and then demanding that Nvidia open-source its drivers, threatening to leak a stolen hardware folder otherwise. Samsung confirmed the same group stole Galaxy device source code, a serious loss of intellectual property. A new record-breaking DDoS amplification vector abuses insecure Mitel VoIP devices as reflectors/amplifiers, per an Akamai report shared with Bleeping Computer. The "Dirty Pipe" Linux kernel bug lets anyone write to any file through an insecure interaction between an on-disk file and a memory-only pipe. New high-severity UEFI firmware flaws were discovered in millions of HP devices, including laptops, desktops, point-of-sale systems and edge computing nodes. Critical RCE bugs in the Pascom Cloud Phone System (an arbitrary path traversal in the web interface, an SSRF stemming from an outdated third-party dependency, and a post-authentication command injection in the exd.pl daemon) can be chained to reach non-exposed endpoints, obtain the administrator password and gain remote code execution, so affected businesses should update ASAP. Finally, Intel, AMD and Arm warned of new speculative-execution CPU bugs reminiscent of Spectre and Meltdown: VUSec researchers detail Branch History Injection (BHI), which bypasses existing mitigations by relying on a global branch history to select targets. 2022-03-11 14:28:08
Overseas TECH DEV Community Typescript Basics: How keyof Works https://dev.to/smpnjn/typescript-basics-how-keyof-works-4kif In JavaScript and TypeScript we often have an object with a fixed set of properties and a separate variable holding keys that are supposed to exist on that object; if those keys can drift, all sorts of issues follow. The example builds a constructedObject by looping over an ourKeys array ("name", "red", "age") and copying values from a myUser object, but "red" is not a key of myUser. keyof resolves this: after defining a custom type User (name, email, age), writing type UserKeys = keyof User yields the union of User's keys and, unlike a hand-written union, stays in sync if User changes later. Typing ourKeys as UserKeys[] (with the result typed as a Partial of User) makes the compiler flag stray keys like "red", which makes the TypeScript slightly easier to write and reduces the maintenance burden. 2022-03-11 14:15:48
Overseas TECH DEV Community 9 Insanely Helpful Kafka Commands Every Developer Must Know https://dev.to/ahmedgulabkhan/9-insanely-helpful-kafka-commands-every-developer-must-know-1246 A rundown of the most popular Kafka CLI commands, run against a local Kafka/Zookeeper setup from an earlier article in the series: (1) list topics with bin/kafka-topics.sh --list --zookeeper localhost:2181; (2) create a topic with --create --topic my-first-kafka-topic plus --partitions and --replication-factor (both kept small for simplicity); (3) describe a topic with --describe; (4) update topic configuration with bin/kafka-configs.sh --alter --entity-type topics --entity-name my-first-kafka-topic --add-config, e.g. cleanup.policy=compact, compression.type=gzip and retention.ms; (5) delete a topic with --delete; (6) produce messages interactively with bin/kafka-console-producer.sh --broker-list localhost:9092 --topic my-first-kafka-topic, typing a message and hitting enter to publish each one; (7) consume with bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic my-first-kafka-topic, which only reads messages published after the consumer starts and, if no group is given, creates a random console-consumer group on every run; pass --consumer-property group.id=my-first-consumer-group to choose a group; (8) add --from-beginning to read the topic from the start, keeping in mind that this only applies the first time a consumer group is created, after which a restarted consumer in the same group resumes from its last committed offset, so specify a stable consumer group instead of letting a random one be generated each time; (9) list consumer groups with bin/kafka-consumer-groups.sh --list --bootstrap-server localhost:9092. 2022-03-11 14:15:01
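The article works entirely with the CLI tools; as an illustration of the same consumer-group and from-beginning ideas from code, a kafka-python sketch (the broker address, topic and group names are placeholders):
    # auto_offset_reset="earliest" plays the role of --from-beginning for a
    # consumer group that has no committed offsets yet.
    from kafka import KafkaConsumer, KafkaProducer

    producer = KafkaProducer(bootstrap_servers="localhost:9092")
    producer.send("my-first-kafka-topic", b"hello kafka")
    producer.flush()

    consumer = KafkaConsumer(
        "my-first-kafka-topic",
        bootstrap_servers="localhost:9092",
        group_id="my-first-consumer-group",
        auto_offset_reset="earliest",
    )
    for msg in consumer:       # blocks until a message arrives
        print(msg.value)
        break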
Overseas TECH DEV Community After several years of development, we are finally launched. B2App No code mobile app builder. https://dev.to/accessible89/after-several-years-of-development-we-are-finally-launched-b2app-no-code-mobile-app-builder-18il For several years I nurtured the idea of a mobile application builder, because developing mobile apps is difficult and time consuming, while for online stores an app is a huge plus for sales. I experimented with the builder's interface and project architecture, deleted and redid it, and eventually shipped an MVP that produced only Android applications written in Java. I then built the iOS version in SwiftUI, realized I had chosen the wrong technology for Android, and rewrote the Android side with Jetpack Compose once it became stable enough for large projects. A few days ago we finished the landing page, payment integration and the builder's editor, and launched. What it does: B2App builds native Android and iOS apps, without code, for online stores based on WooCommerce, Shopify, Shop Script and others, letting you create a polished, multifunctional mobile app in a few clicks. Why it is needed: mobile e-commerce apps let you sell to customers more flexibly, but their development is long and expensive; with B2App you can get one in minutes. Who it is for: owners of online stores on WooCommerce, Shopify and similar platforms. What makes it stand out: easy-to-use pre-built templates and themes, an intuitive interface for customizing colors and layouts, native apps with a polished UI, and real-time app updates without resubmitting to the App Store. What's next: the ability to export the generated Android and iOS source code in minutes for further customization, and support for projects beyond e-commerce. 2022-03-11 14:04:55
Linux OMG! Ubuntu! Desktop Cube GNOME Extension Now Supports Background Panoramas https://www.omgubuntu.co.uk/2022/03/desktop-cube-gnome-extension-panoramas Oof, the bling-tastic Desktop Cube GNOME extension by Simon Schneegans just keeps getting better. The latest update to the Compiz-inspired effect adds support for skyboxes: immersive background panoramas that replace the background behind the workspace switcher in newer GNOME versions, resulting in an immersive 360°-esque stage. You can find compatible panoramas on sites like polyhaven.com/hdris; just find a panorama that looks good, download the tone-mapped JPEG version to use with Desktop Cube, then pop open the extension's settings via the Extensions Prefs tool (or similar) and add it. Also new in Desktop Cube is the optional ability to ... This post is from OMG! Ubuntu!; do not reproduce elsewhere without permission. 2022-03-11 14:38:15
Finance RSS FILE - Japan Securities Dealers Association List of issues with English-language disclosure https://www.jsda.or.jp/shijyo/foreign/meigara.html disclosure 2022-03-11 15:34:00
Finance Financial Services Agency website Updated the schedule of upcoming hearing dates. https://www.fsa.go.jp/policy/kachoukin/06.html hearing dates 2022-03-11 16:00:00
Finance Financial Services Agency website Announced the decision to order payment of an administrative monetary penalty for insider trading by an officer of a party negotiating a contract with Mitsui Sugar Co., Ltd. https://www.fsa.go.jp/news/r3/shouken/20220311-1.html Mitsui Sugar 2022-03-11 16:00:00
Finance Financial Services Agency website Announced the decision to order payment of an administrative monetary penalty for market manipulation involving Mitsuba Corporation shares. https://www.fsa.go.jp/news/r3/shouken/20220311-2.html market manipulation 2022-03-11 16:00:00
Finance Financial Services Agency website Announced the decision to order payment of an administrative monetary penalty for insider trading by a recipient of information from a Leopalace21 employee. https://www.fsa.go.jp/news/r3/shouken/20220311-3.html insider trading 2022-03-11 16:00:00
News BBC News - Home Anthony Russell: Triple killer given whole-life prison sentence https://www.bbc.co.uk/news/uk-england-coventry-warwickshire-60707696?at_medium=RSS&at_campaign=KARANGA final 2022-03-11 14:56:50
News BBC News - Home Mystery drone from Ukraine war crashes in Croatia https://www.bbc.co.uk/news/world-europe-60709952?at_medium=RSS&at_campaign=KARANGA ukrainian 2022-03-11 14:22:00
Business Diamond Online - New articles Xi considers appointing a close confidant as head of economic policy - via WSJ https://diamond.jp/articles/-/298972 economic policy 2022-03-11 23:17:00
Hokkaido Hokkaido Shimbun NY yen trades in the upper 116 range against the dollar https://www.hokkaido-np.co.jp/article/655960/ foreign exchange market 2022-03-11 23:12:00
Hokkaido Hokkaido Shimbun Basic policy to prevent a recurrence of improper bidding, with advance publication of estimated prices, in Minamifurano, the town under suspicion of official-led bid-rigging https://www.hokkaido-np.co.jp/article/655922/ Kamikawa region 2022-03-11 23:05:20
Crypto BITPRESS [CoinDesk Japan] UK regulator orders crypto-asset ATMs to shut down https://bitpress.jp/count2/3_9_13107 coindeskjapan 2022-03-11 23:33:36
Crypto BITPRESS Coincheck runs an "Up to ¥10,000! Coincheck Tsumitate cashback campaign" through April 11 https://bitpress.jp/count2/3_14_13106 coincheck 2022-03-11 23:12:53
