Posted: 2021-05-16 04:16:08 | RSS feed digest for 2021-05-16 04:00 (25 items)

Category | Site | Article title / trend word | Link URL | Frequent words & summary / search volume | Registered
python | New posts tagged Python - Qiita | Getting elements and text inside GML tags with Python | https://qiita.com/popo62520908/items/ea0575be5b309cb920b4 | tree = ET.parse('hoge.gml') reads hoge.gml, and root = tree.getroot() gets its root element (gml:FeatureCollection). Because the polygon sits inside the PolygonPatch element, first fetch PolygonPatch and then read the text inside posList with for child in root.iter(...). Note that the tag passed to iter() is not the literal <gml:PolygonPatch> prefix but the gml namespace URL declared on the root gml:FeatureCollection. | 2021-05-16 03:07:19
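The summary above only sketches the ElementTree pattern; here is a minimal illustrative version of it, assuming a file named hoge.gml and treating the namespace URL as a placeholder you replace with the one written on your file's root element:

    # Minimal sketch of the pattern described above (not the article's exact code).
    # "hoge.gml" and the namespace URL are placeholders: use the URL declared on
    # your file's root gml:FeatureCollection element instead of the "gml:" prefix.
    import xml.etree.ElementTree as ET

    GML = "{http://www.opengis.net/gml/3.2}"   # placeholder namespace URL

    tree = ET.parse("hoge.gml")                # read hoge.gml
    root = tree.getroot()                      # root element (gml:FeatureCollection)

    for patch in root.iter(GML + "PolygonPatch"):       # find every PolygonPatch
        for pos_list in patch.iter(GML + "posList"):     # then the posList inside it
            print(pos_list.text)                         # the coordinate text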
Program | New questions (all tags) - teratail | Determining from a retrieved file object whether it is a Google Doc and reading its text | https://teratail.com/questions/338529?rss=all | Background / goal: in Google Drive, given a file object I have retrieved, how should I determine whether that file is a Google Document and, if it is, read its contents as text? Any guidance would be appreciated. | 2021-05-16 03:45:44
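One possible approach, shown only as a sketch and not necessarily the asker's environment: with the Drive v3 API via google-api-python-client you can check the file's mimeType and, for a Google Doc, export it as plain text. The creds and file_id below are assumed to exist already.

    # Sketch: decide whether a Drive file is a Google Doc and, if so, read its text.
    # Assumes google-api-python-client plus already-obtained OAuth credentials
    # (`creds`) and a `file_id`; both are placeholders here.
    from googleapiclient.discovery import build

    GOOGLE_DOC_MIME = "application/vnd.google-apps.document"

    def read_if_google_doc(creds, file_id):
        service = build("drive", "v3", credentials=creds)
        meta = service.files().get(fileId=file_id, fields="id,name,mimeType").execute()
        if meta["mimeType"] != GOOGLE_DOC_MIME:
            return None                                   # not a Google Document
        data = service.files().export(fileId=file_id, mimeType="text/plain").execute()
        return data.decode("utf-8")                       # exported bytes -> text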
Program | New questions (all tags) - teratail | Fixing one column and getting the size for every other column (pandas) | https://teratail.com/questions/338528?rss=all | I want to take one fixed column and, for each of the remaining columns, get the size of that column paired with the fixed one. | 2021-05-16 03:29:24
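One reading of the question, as a sketch with made-up column names and data: group on the fixed column together with each other column in turn and take .size().

    # Sketch: for every column other than the fixed one, count rows per
    # (fixed_column, other_column) pair. Column names and data are illustrative.
    import pandas as pd

    df = pd.DataFrame({
        "fixed": ["a", "a", "b", "b", "b"],
        "x": [1, 1, 2, 2, 3],
        "y": ["p", "q", "p", "p", "q"],
    })

    for col in df.columns.drop("fixed"):
        print(df.groupby(["fixed", col]).size(), end="\n\n")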
Ruby | New posts tagged Ruby - Qiita | Let's build an attendance-check web app for a sandlot baseball team! Part 7 | https://qiita.com/tomodachi_uec/items/fa13d5ee05071654d4ad | List screen ⇔ detail screen ⇔ edit screen: I finally noticed the issue, so I am fixing this as well. | 2021-05-16 03:12:31
golang | New posts tagged Go - Qiita | I wrote a remote command execution tool six times faster than napalm | https://qiita.com/umatare5/items/63d8e0479e88d4e80f09 | The command (C) option specifies the command to execute. | 2021-05-16 03:36:16
Ruby | New posts tagged Rails - Qiita | [Rails] 500 error (Internal Server Error) on some pages after deploying to Heroku | https://qiita.com/ay4528/items/50430899956d1606a28d | Problem: the app works fine locally, but after deploying to Heroku (PostgreSQL support and related changes are already in place) some pages raise an error and show "We're sorry, but something went wrong". | 2021-05-16 03:27:52
Ruby | New posts tagged Rails - Qiita | Let's build an attendance-check web app for a sandlot baseball team! Part 7 | https://qiita.com/tomodachi_uec/items/fa13d5ee05071654d4ad | List screen ⇔ detail screen ⇔ edit screen: I finally noticed the issue, so I am fixing this as well. | 2021-05-16 03:12:31
Overseas TECH | DEV Community | Working with the RedshiftToS3Transfer operator and Amazon Managed Workflows for Apache Airflow | https://dev.to/aws/working-with-the-redshifttos3transfer-operator-and-amazon-managed-workflows-for-apache-airflow-56n9

Introduction. In this post I am going to look in more detail at the launch post of Amazon Managed Workflows for Apache Airflow (MWAA). In that post a workflow was created to take some source files (in this case my old friends, the MovieLens data set), move them into Amazon Athena where they were transformed, and then upload them into Amazon Redshift. Here I am going to add an additional step: taking the tables from that Amazon Redshift database and exporting them to Amazon S3, a common task that data engineers are asked to do. So we will use Apache Airflow to take a file, transform it, store it in our data warehouse, and then export it to our data lake. Let's get started.

What will you need: an AWS account with the right level of privileges; an up-to-date aws cli; a MWAA environment up and running (may I suggest you check out some of my earlier blogs); and the source code for this post, which you will find at the usual place over on GitHub.

Costs: when I ran this and took a look at my AWS bill, the cost for the hours I spent playing around putting this post together was small. Make sure you clean up and delete all the resources after you have finished.

Getting started. The first thing we need to do is set up the Amazon Redshift cluster. To make this easy I have created a CDK app that builds everything you need: an app.py, a files folder (containing readme.txt), and a mwaa_redshift package with the stack and VPC definitions. app.py contains configuration options you will need to change for your own environment: redshifts3location is the name of the NEW S3 bucket that will be created (it must not already exist, or the deployment will fail); mwaadag is the location of the MWAA DAGs folder; mwaa-sg is the security group of your MWAA environment, which the deployment amends with an additional ingress rule for Redshift; mwaa-vpc-id is the VPC id, used to populate the Redshift subnet group and enable connectivity; and redshiftclustername names the cluster. Finally, adjust the env details (account and region) to reflect your own environment. Once you have changed these values you can deploy the stacks:

    cdk deploy MWAA-Redshift-VPC
    cdk deploy MWAA-Redshift-Cluster

If the deployment is successful you should see outputs that you will use later on: the Redshift cluster endpoint, the Redshift IAM role ARN, the Redshift secret ARN (stored in AWS Secrets Manager), and the name of the new subnet group. You can have a look at the Amazon Redshift console if you want, and you should see the new cluster ready to go.

Update permissions for your MWAA environment.
Now that the Amazon Redshift cluster has been set up, we have a new Amazon S3 bucket (in my demo it is called mwaa-redshift-blog) with a folder called files that will be used to download data from the web, transform it, and then ingest it into Amazon Redshift. We need to add some additional permissions to the MWAA execution policy so that it can read and write files in this new bucket: alongside the existing airflow bucket ARNs, add arn:aws:s3:::mwaa-redshift-blog and arn:aws:s3:::mwaa-redshift-blog/* to the Resource list of the S3 statement. While we are talking about permissions: the Redshift cluster deployment also creates a new IAM role (its name appears in the CDK outputs) whose policy grants S3 access only to the new bucket and the MWAA DAGs bucket. You could tighten this up further by trimming the specific S3 actions, so experiment with removing them until you find something that still works.

Uploading and running the movielens DAG. Now that the infrastructure is ready, it is time to create our workflow (DAG). I have modified the original DAG from the launch post slightly; you can find it as movielens_redshift.py. So that nothing is hard-coded, the DAG reads its configuration from Apache Airflow variables via Variable.get(..., default_var="undefined"): test_http, download_http, s3_bucket_name, s3_key, redshift_cluster, redshift_db, redshift_dbuser, redshift_table_name, redshift_iam_arn, redshift_secret_arn, demo_athena_db, and athena_results. This makes the workflow much easier to re-purpose. In the GitHub repo you will find a variables.json file with sample values for all of these (for example redshift_cluster is mwaa-redshift, redshift_db is mwaa, redshift_dbuser is awsuser, redshift_table_name is movie_demo, redshift_airflow_connection is redshift_default, and aws_connection is aws_redshift). You WILL need to modify the last three entries, redshift_iam_arn, redshift_secret_arn, and s3_bucket_name, using the values output by the Redshift cluster build. Once amended, import the file into MWAA via the Apache Airflow UI; you should then see the list of variables with their values. MWAA stores these securely in the MWAA metastore database. If you prefer, you could change the MWAA configuration to look up variables in AWS Secrets Manager and manage the values via CDK, but for this post I am keeping it simple and using standard variables through the Apache Airflow UI.
The rest of the DAG is the same as the original blog post, and you should deploy it to your DAGs folder via your preferred method. I use a very simple CI/CD system, which you can replicate from my post "A simple CI/CD system for your Amazon Managed Workflows for Apache Airflow development workflow". Once uploaded, you should see the DAG in the main Apache Airflow UI.

Triggering the DAG. From the UI, enable and then trigger the DAG called movielens_redshift; the workflow should take a few minutes to complete. If every task shows dark green you should be good. The task logs show the COPY statement being built from the Athena results file on S3 and submitted via the Redshift Data API, and the task being marked as SUCCESS; if you look at Queries in the Redshift console you should also see a successful query appear. The part of the DAG that moves the data into Redshift is the following Python callable:

    def s3_to_redshift(**kwargs):
        ti = kwargs['task_instance']
        queryId = ti.xcom_pull(key='return_value', task_ids='join_athena_tables')
        print(queryId)
        athenaKey = 's3://' + s3_bucket_name + '/athena_results/join_athena_tables/' + queryId + '_clean.csv'
        print(athenaKey)
        sqlQuery = "copy " + redshift_table_name + " from '" + athenaKey + "' iam_role '" + redshift_iam_arn + "' CSV IGNOREHEADER 1;"
        print(sqlQuery)
        rsd = boto3.client('redshift-data')
        resp = rsd.execute_statement(
            ClusterIdentifier=redshift_cluster,
            Database=redshift_db,
            DbUser=redshift_dbuser,
            SecretArn=redshift_secret_arn,
            Sql=sqlQuery
        )
        print(resp)
        return "OK"

In essence we are not using an Apache Airflow operator here, but plain Python code, boto3, and the redshift-data APIs.
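Note that execute_statement is asynchronous: the task returns "OK" as soon as the statement is submitted. If you wanted the task to wait for the COPY to finish, one option (not part of the original DAG, just an illustrative sketch) is to poll the Redshift Data API:

    # Illustrative sketch only: poll the Redshift Data API until a submitted
    # statement finishes. statement_id would be the 'Id' field from the
    # execute_statement response above.
    import time
    import boto3

    def wait_for_statement(statement_id, poll_seconds=5):
        rsd = boto3.client('redshift-data')
        while True:
            desc = rsd.describe_statement(Id=statement_id)
            if desc['Status'] in ('FINISHED', 'FAILED', 'ABORTED'):
                return desc
            time.sleep(poll_seconds)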
Congratulations, you have now replicated the original launch blog post for MWAA. Now let's look at how we move that data to S3.

Uploading and running the movielens_redshift_s3 DAG. You can find this DAG as movielens_redshift_s3.py. The first section looks familiar: it uses the same Airflow variables (s3_bucket_name, s3_key, redshift_table_name) plus two more, redshift_airflow_connection and aws_connection, so there is nothing new to create. What we DO need to do, however, is create an Apache Airflow connection that this DAG will use to connect to the Redshift cluster. In the first DAG we used boto3 and the redshift-data APIs directly; in this DAG we use an operator called RedshiftToS3Transfer, and the DAG itself is very simple:

    unload_to_S3 = RedshiftToS3Transfer(
        task_id='unload_to_S3',
        schema='public',
        table=redshift_table_name,
        s3_bucket=s3_bucket_name,
        s3_key=s3_key,
        redshift_conn_id=redshift_airflow_connection,
        unload_options=['CSV'],
        aws_conn_id=aws_connection
    )

For this to work we need to create a new connection containing the details of the Redshift cluster. Give it the Conn ID redshift_default, which is how the code refers to it (via the redshift_airflow_connection entry in variables.json). For the Conn Type select Amazon Web Services. For the Host use the Redshift cluster endpoint, which was output as part of the CDK app deployment. For Schema enter mwaa, the name of the database we created (change it if you deviated from the defaults). For the username enter awsuser (again, change it if you changed the default), and for the password retrieve the value from AWS Secrets Manager (it will be a randomised string). Finally, set the port to 5439, the default Redshift port. Save the connection and you are ready to go.
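The post shows only the task itself; below is a minimal sketch of how it might be wired into a complete DAG file, assuming Airflow 1.10.x as used by MWAA at the time (where the operator is importable from airflow.operators.redshift_to_s3_operator; in Airflow 2 it lives in the Amazon provider package as RedshiftToS3Operator):

    # Minimal sketch (not the author's full DAG) of wiring the unload task into a
    # DAG, with the imports the excerpt above omits. Assumes Airflow 1.10.x.
    from datetime import datetime

    from airflow import DAG
    from airflow.models import Variable
    from airflow.operators.redshift_to_s3_operator import RedshiftToS3Transfer

    s3_bucket_name = Variable.get("s3_bucket_name", default_var="undefined")
    s3_key = Variable.get("s3_key", default_var="undefined")
    redshift_table_name = Variable.get("redshift_table_name", default_var="undefined")
    redshift_airflow_connection = Variable.get("redshift_airflow_connection", default_var="undefined")
    aws_connection = Variable.get("aws_connection", default_var="undefined")

    with DAG(dag_id="movielens_redshift_s3",
             start_date=datetime(2021, 5, 1),
             schedule_interval=None,        # triggered manually from the UI
             catchup=False) as dag:
        unload_to_S3 = RedshiftToS3Transfer(
            task_id="unload_to_S3",
            schema="public",
            table=redshift_table_name,
            s3_bucket=s3_bucket_name,
            s3_key=s3_key,
            redshift_conn_id=redshift_airflow_connection,
            unload_options=["CSV"],
            aws_conn_id=aws_connection,
        )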
Triggering the DAG. From the UI, enable and then trigger the DAG called movielens_redshift_s3. If we try to run the export now, what happens? It appears to hang, and after a while the logs show the UNLOAD command being attempted and then an error along the lines of "could not connect to server: Connection timed out. Is the server running on host mwaa-redshift-clusterxxx...redshift.amazonaws.com and accepting TCP/IP connections on port 5439?". This is to be expected: the MWAA environment and the Amazon Redshift cluster are in two different VPCs, and by default there is no access between them. So what are our options? We have a few: we could create a single VPC, deploy both MWAA and the Amazon Redshift cluster into it, and use security groups to control access at the network level; we could enable public access on Amazon Redshift and use security groups to control who can reach it; we could configure our own networking between the MWAA VPC and the Redshift VPC, for example VPC peering; or we could configure Redshift managed VPC endpoints. In this post I am going to use the last option. You can dive deeper into this topic in the post "Enable private access to Amazon Redshift from your client applications in another VPC".

Configuring Amazon Redshift managed VPC endpoints. The first thing we need to do is enable a feature of the Redshift cluster called cluster relocation, which we can do through the aws cli (adjust for your cluster name and region):

    aws redshift modify-cluster --cluster-identifier <your-cluster-name> --availability-zone-relocation --region <your-region>

The command echoes back the cluster description. You can check that the change has taken effect with the following (if you do not use jq, just look for the AvailabilityZoneRelocationStatus field in the output):

    aws redshift describe-clusters --cluster-identifier <your-cluster-name> --region <your-region> | jq -r '.Clusters[].AvailabilityZoneRelocationStatus'

You should get back "enabled" if it is working OK.
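If you would rather stay in Python than pipe the aws cli through jq, the same check could look like this (a sketch; the cluster identifier and region placeholders are whatever your deployment used):

    # Sketch: confirm availability zone relocation is enabled, the boto3
    # equivalent of the describe-clusters | jq command above.
    import boto3

    redshift = boto3.client('redshift', region_name='<your-region>')
    cluster = redshift.describe_clusters(ClusterIdentifier='<your-cluster-name>')['Clusters'][0]
    print(cluster.get('AvailabilityZoneRelocationStatus'))   # expect "enabled"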
The CDK app will also have created a new subnet group; its name is in the outputs. This subnet group contains the subnet ids of the MWAA VPC, so grab that info as you will need it in the next step, creating the VPC endpoint itself. Replace the parameters below with your Redshift cluster name, your AWS account id, the Redshift subnet group mentioned above, and the MWAA security group; you can change the endpoint name if you like, and to make this easier the CDK outputs include the command line you need to run:

    aws redshift create-endpoint-access --cluster-identifier <your-cluster-name> --resource-owner <your-aws-account> --endpoint-name mwaa-redshift-endpoint --subnet-group-name <your-subnet-group> --vpc-security-group-ids <your-vpc-sg> --region <your-region>

This takes a few minutes to create, and the result is a new VPC endpoint that our MWAA environment has access to, with a new Redshift endpoint address that we need to use in place of the host in the Apache Airflow connection we created earlier. To find it (change mwaa-redshift-endpoint if you used a different name):

    aws redshift describe-endpoint-access --endpoint-name mwaa-redshift-endpoint | jq '.EndpointAccessList[].Address'

Update the connection in the Apache Airflow UI with this new Redshift endpoint address. When we trigger the DAG again it now connects, which is progress, but alas we get a different error: "S3ServiceException: The AWS Access Key Id you provided does not exist in our records" (InvalidAccessKeyId) while Redshift tries to list the target bucket. It turns out we need to do one more thing.

Configuring AWS credentials for RedshiftToS3Transfer. We now need to configure the credentials that the Amazon Redshift cluster will use when running the UNLOAD operation. We have a couple of options: store the AWS credentials as a JSON tuple in the Apache Airflow connection, or store the same credentials using the native integration with AWS Secrets Manager. To keep things simple I am going to use the Apache Airflow connection here and will cover the other approach in a follow-on post. You will need to create (or reuse) an IAM user that is allowed to connect to the Redshift cluster and perform the unload, created with minimal permissions, and have its aws_access_key_id and aws_secret_access_key to hand. From Connections in the Apache Airflow UI, find the connection the DAGs use for AWS (in this example I used one called aws_redshift); it is currently empty, so add your keys to the Extra field in the following format:

    {"aws_access_key_id": "<your access key id>", "aws_secret_access_key": "<your secret access key>"}

After saving this we can try again. Success: the workflow shows dark green, the logs show the UNLOAD command completing, and if we look at the S3 bucket we can see the files that have been exported from Redshift.
Troubleshooting. As I was testing this I came across a few errors, so here is what they were and how I resolved them.

AWS cli version: when I ran aws redshift create-endpoint-access I got errors, and looking at the available options create-endpoint-access was not there. I was on an older version of the aws cli; upgrading resolved the problem.

S3 folder issues: the CDK application creates the bucket and a files folder that is used by the first DAG, but the S3 key sensor kept logging "Poking for key : s3://mwaa-redshift-blog/files/" until the task failed with "Snap. Time is OUT.". Even though the folder was there, MWAA (specifically this operator) could not see it. In the end I realised I had forgotten the trailing slash in the destination_key_prefix of the CDK application (destination_key_prefix="files/"); once I added that, it worked fine.

Amazon Redshift username/password error: when triggering the Redshift-to-S3 DAG I got "FATAL: password authentication failed for user awsuser". The resolution was simple: I had forgotten to add the password to the connection document in MWAA, so I obtained it from AWS Secrets Manager and stored it in the password field.

Amazon Redshift connection times out: while I was setting up the Redshift managed VPC endpoint, the unload task stayed green (running) for a while and eventually failed with "could not connect to server: Connection timed out". It took me a while to figure out, but the solution involved a few things: set up a Redshift subnet group with the subnets from the MWAA environment; allow inbound Redshift traffic (port 5439) on the MWAA security group; and set up the Redshift managed VPC endpoint with the correct environment. You need to do this AFTER the subnet group has been set up, otherwise only the existing Redshift VPC will appear.

Conclusion. In this post I have shown how you can use Apache Airflow to orchestrate data engineering tasks across a number of AWS services: importing data from its origin into Amazon S3, transforming it via Amazon Athena, and creating tables in Amazon Redshift before exporting them to an Amazon S3 bucket.

Clean up. Make sure you delete all the resources, which you can do quickly by running these commands and then emptying and deleting the Amazon S3 bucket:

    cdk destroy MWAA-Redshift-Cluster
    cdk destroy MWAA-Redshift-VPC

Survey: please complete the very short survey (linked from the original post) to let me know whether you found this useful and how I can make these posts even better. Many thanks. | 2021-05-15 18:40:31
Overseas TECH | DEV Community | VimGore: an interactive game to learn vim | https://dev.to/ps173/vimgore-an-interactive-game-to-learn-vim-10kn | I wanted to try the MERN stack, so I started with this idea of building a game that gives you snippets of code for the player to correct using vim mode, and the idea was enough motivation to jump into the project. The app (source code and live site) is linked from the post; it is not fully featured and is missing a lot of stuff, but I would be happy to get overall feedback, and if you find issues please report them in the repo. This was my first proper full-stack project. Thanks for reading, and keep up the hard work! | 2021-05-15 18:13:44
Apple | AppleInsider - Frontpage News | Epic lawsuit continues, troublesome hires, and Detroit - This Week in Apple | https://appleinsider.com/articles/21/05/15/epic-lawsuit-continues-troublesome-hires-and-detroit---this-week-in-apple?utm_medium=rss | The trial between Epic Games and Apple offered more revelations about the companies and the App Store in court, while the rest of the week centered around a controversial hiring, iPhone rumors, and the Detroit Apple Developer Academy. Every week AppleInsider posts a large number of articles about Apple, including stories and rumors about its products and connected third parties; in the video series This Week in Apple we condense the last seven days of stories into a single video. | 2021-05-15 18:24:24
Overseas news | Japan Times latest articles | Tokyo snaps five-game losing streak in rout of Kashiwa | https://www.japantimes.co.jp/sports/2021/05/15/soccer/j-league/tokyo-kashiwa-adailton/ | With fans increasingly calling for change after five straight losses in the J. League's first division, FC Tokyo manager Kenta Hasegawa went into Saturday's game... | 2021-05-16 03:24:43
News | BBC News - Home | Chelsea 0-1 Leicester: Foxes lift FA Cup for first time after Youri Tielemans stunner | https://www.bbc.co.uk/sport/football/57055571 | Leicester City lift the FA Cup for the first time as Youri Tielemans' stunning long-range goal earns the Foxes victory over Chelsea at Wembley. | 2021-05-15 18:47:14
News | BBC News - Home | FA Cup Final: Wes Morgan's blushes are spared by VAR as late equaliser is ruled out | https://www.bbc.co.uk/sport/av/football/57129641 | Incredible drama after Wes Morgan's last-gasp own goal is chalked off by VAR to deny Chelsea a late equaliser in the FA Cup final against Leicester City. | 2021-05-15 18:36:31
News | BBC News - Home | Premiership: Leicester 35-29 Harlequins - Genge scores brace as Tigers beat play-off contenders | https://www.bbc.co.uk/sport/rugby-union/57088563 | Leicester Tigers survive a ferocious fightback to check Harlequins' play-off hopes with a thrilling win at Welford Road. | 2021-05-15 18:52:44
News | BBC News - Home | Southampton 3-1 Fulham: Che Adams, Nathan Tella & Theo Walcott earn Saints back-to-back wins | https://www.bbc.co.uk/sport/football/57034889 | Ralph Hasenhuttl called Nathan Tella the biggest talent at Southampton's disposal after the striker scored his first senior goal in Saturday's victory over relegated Fulham. | 2021-05-15 18:12:59
Business | Diamond Online - New articles | Only Workman does this! A mechanism for starting small and following through until big results appear - Workman-style "management by not doing" | https://diamond.jp/articles/-/267800 | 2021-05-16 03:50:00
Business | Diamond Online - New articles | The "three big habits" that quickly get the brain motivated and generate passionate motivation - Stanford-Style Strength to Survive | https://diamond.jp/articles/-/267867 | 2021-05-16 03:45:00
Business | Diamond Online - New articles | A "certain ritual" for starting work right away in the morning - Regain Deep Focus | https://diamond.jp/articles/-/271145 | concentration | 2021-05-16 03:40:00
Business | Diamond Online - New articles | [Don't be afraid to change] What I want to tell people who see change as failure - SHOCK EYE's Strong-Luck Thinking | https://diamond.jp/articles/-/271088 | 2021-05-16 03:35:00
Business | Diamond Online - New articles | The "power to abstract" and the "power to make concrete" that businesspeople need: Kazushige Okuno and Eiji Doi special dialogue (3) - Teacher, How Do I Become Rich? | https://diamond.jp/articles/-/270123 | 2021-05-16 03:30:00
Business | Diamond Online - New articles | Why does just using Google Sheets give you a 10X boost? - Google-Style 10X Remote Work | https://diamond.jp/articles/-/267797 | Masanori Kanda, described as Japan's top marketer and an international judge for the ECHO Awards (a global marketing authority), praises the book highly; the author's first solo title went straight into additional printings. | 2021-05-16 03:25:00
Business | Diamond Online - New articles | Understanding how Bitcoin works through the Byzantine Generals Problem - Mathematics as Culture That Only Billionaires Know | https://diamond.jp/articles/-/269463 | billionaire | 2021-05-16 03:20:00
Business | Diamond Online - New articles | The convincing reason a University of Tokyo graduate pro gamer declares that "the better you are at teaching, the stronger you are" - Effort 2.0, As Practiced by the World's Best Pro Gamer | https://diamond.jp/articles/-/270187 | adaptation | 2021-05-16 03:15:00
Business | Diamond Online - New articles | [A hot topic on the TV show "土曜はナニする!?"] Rib correction rather than pelvic correction? Why rib training gives you a waistline - Easy Waist-Making at Home with Rib Training | https://diamond.jp/articles/-/271052 | With a short daily routine, the face, neck, under-bust, waist, and upper arms all slim down; become a "rib beauty" and everything starts to move. Even if you diet because lack of exercise or weight gain has ruined your body line, a large share of people's figures do not actually improve. | 2021-05-16 03:10:00
Business | Diamond Online - New articles | [President Haruaki Deguchi] What is the "true education" that Socrates, Plato, and Aristotle, the ancient Greek philosophers of the era when the "explosion of knowledge" occurred, teach us? - A Complete History of Philosophy and Religion | https://diamond.jp/articles/-/270328 | 2021-05-16 03:05:00
