Posted: 2023-03-06 23:36:18 RSS feed digest as of 2023-03-06 23:00 (36 items)

Category / Site / Article title or trend word / Link URL / Frequent words or summary / Date registered
ROBOT Robosta [Breaking] NTT / University of Tokyo / RIKEN / JST announce results of an innovative quantum-computer technology, measuring a world-fastest 43 GHz; fusing quantum computing with optical communications toward quantum multi-core, "We want to replace IBM Q" https://robotstart.info/2023/03/06/quantum-ntt-tokyo-u.html Nippon Telegraph and Telephone Corporation (NTT), the University of Tokyo, and RIKEN jointly developed a new technology that applies state-of-the-art commercial optical-communication technology to the optical quantum field, and published the results in a US scientific journal. 2023-03-06 13:00:54
js New posts tagged JavaScript - Qiita Checking how the reactivity (reactive) of a Vue 3 store (Pinia) behaves https://qiita.com/yuta-katayama-23/items/3c2f8fb4cbced3b3420a compositionapi 2023-03-06 22:12:01
Ruby New posts tagged Ruby - Qiita Ruby on Rails Tutorial, Chapter 4 https://qiita.com/kazuki_hoshi/items/d98a225995bb5709f764 rubyonrails 2023-03-06 22:28:37
Ruby New posts tagged Ruby - Qiita Basic troubleshooting with Docker and Ruby on Rails https://qiita.com/NealChambers/items/9065ad5ff252345df74d docker 2023-03-06 22:12:48
Ruby New posts tagged Ruby - Qiita Common calculations in Ruby https://qiita.com/ryuuya0921/items/655e91307eccc3507f07 putsputs 2023-03-06 22:10:52
AWS New posts tagged AWS - Qiita [AWS] How to deal with a CodeDeploy timeout error that leaves "deployment of the replacement task set" stuck indefinitely https://qiita.com/Ryo-0131/items/fac616e4889f19643b62 awscodedeploy 2023-03-06 22:54:20
AWS New posts tagged AWS - Qiita Making AWS CLI profile switching easier with aliases https://qiita.com/engine-x/items/8c18bea8a05e97a20bdc awscli 2023-03-06 22:43:17
Docker New posts tagged docker - Qiita Basic troubleshooting with Docker and Ruby on Rails https://qiita.com/NealChambers/items/9065ad5ff252345df74d docker 2023-03-06 22:12:48
Ruby New posts tagged Rails - Qiita Ruby on Rails Tutorial, Chapter 4 https://qiita.com/kazuki_hoshi/items/d98a225995bb5709f764 rubyonrails 2023-03-06 22:28:37
Ruby New posts tagged Rails - Qiita Let's introduce devise, part 1 https://qiita.com/ooyama-tetu/items/4bfb33ed8001e110971c devise 2023-03-06 22:15:54
Ruby New posts tagged Rails - Qiita Basic troubleshooting with Docker and Ruby on Rails https://qiita.com/NealChambers/items/9065ad5ff252345df74d docker 2023-03-06 22:12:48
Overseas TECH MakeUseOf The Best Air Purifiers for Small Spaces https://www.makeuseof.com/best-air-purifiers-for-small-spaces/ purifiers 2023-03-06 13:05:16
Overseas TECH DEV Community How to build a bottom navigation component with Tailwind CSS and Flowbite https://dev.to/themesberg/how-to-build-a-bottom-navigation-component-with-tailwind-css-and-flowbite-e9m In this tutorial you will learn how to build a bottom navigation UI component using the utility classes from Tailwind CSS and the design system from Flowbite. A bottom navigation component can serve as a navigational menu, CTA actions, and control buttons positioned at the bottom of the page as you scroll down; the finished component is responsive and supports dark mode. The tutorial proceeds in five steps: (1) set up the basic HTML structure with a `<div>` wrapper and a list of `<button>` or `<a>` elements (Home, Wallet, Settings, Profile); (2) position the component with the `fixed` utility class and a high z-index so it stays above other elements while scrolling; (3) add a maximum width to the inner `<div>` and make it fluid on mobile devices; (4) style each button, adding an SVG icon with `aria-hidden="true"` for accessibility; (5) add the remaining buttons and apply the dark-mode classes, so the inverse colors are activated when Tailwind's dark-mode setting is used. This component is part of a larger collection of free and open-source Tailwind CSS bottom navigation components from the Flowbite Library. Credits: Tailwind CSS, Flowbite. 2023-03-06 13:34:07
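The markup from the tutorial above can be sketched as follows. The exact utility-class values (sizes, z-index, gray shades) were lost in extraction, so the classes below are assumptions based on common Tailwind/Flowbite conventions; the markup is generated from JavaScript only so the structure is easy to test.

```javascript
// Hypothetical reconstruction of the tutorial's bottom-navigation markup.
// Class values are assumed, not taken from the original article.
function bottomNavButton(label) {
  return [
    '<button type="button" class="inline-flex flex-col items-center justify-center px-5 hover:bg-gray-50 dark:hover:bg-gray-800 group">',
    `  <span class="text-sm text-gray-500 dark:text-gray-400 group-hover:text-blue-600">${label}</span>`,
    '</button>',
  ].join('\n');
}

function bottomNav(labels) {
  // grid-cols-N matches the number of buttons, as in the tutorial's 4-column grid.
  const buttons = labels.map(bottomNavButton).join('\n');
  return [
    '<div class="fixed bottom-0 left-0 z-50 w-full h-16 bg-white border-t border-gray-200 dark:bg-gray-700 dark:border-gray-600">',
    `  <div class="grid h-full max-w-lg grid-cols-${labels.length} mx-auto">`,
    buttons,
    '  </div>',
    '</div>',
  ].join('\n');
}

console.log(bottomNav(['Home', 'Wallet', 'Settings', 'Profile']));
```

In real usage the HTML would be written directly in the page rather than generated; the generator form is only a convenience for illustrating the nesting.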
Overseas TECH DEV Community Synchronize Chrome Extensions state https://dev.to/codegino/syncronize-chrome-extensions-state-2bn0 How to use custom events to sync state between Chrome extension instances. This is a continuation of the previous article about creating a Chrome extension using Svelte and Tailwind. The problem: when an action is performed in one instance of the plugin, the data is not synced automatically; for example, if the user updates the count in one tab, the other tabs' content does not react to the change. After setting up the project (clone the repo from the previous blog, easiest with `npx degit`, then install dependencies and run the application), the tutorial uses the `chrome.runtime` API, which provides a way to send messages between different parts of the extension. The steps are: trigger an event (required), add an event listener (required), handle the response from listeners, handle connection errors, and handle disconnection. To trigger an event, the save handler is rewritten in async/await syntax and dispatches a message via `chrome.runtime.sendMessage` with a payload; a `type` property on the payload identifies the event in the listener. The listener is registered in Svelte's `onMount` hook with `chrome.runtime.onMessage.addListener`, and when a matching message arrives the local count is updated, so the count stays in sync across all tabs and the popup (which uses the same component). In the handler, `sendResponse` can be called with a payload to send back to the sender; since `sendMessage` returns a promise, the sender can `await` the response and, in this example, append it to the success message. If only one instance of the plugin is running, `sendMessage` throws, and the success message would otherwise stay visible because the code that hides it is never reached, so the call is wrapped in a try/catch; with the callback API you must instead explicitly check `chrome.runtime.lastError`. Finally, to free resources the listener is removed with `chrome.runtime.onMessage.removeListener`, either in the `onDestroy` hook or by returning a cleanup function from `onMount`; once removed, the other tabs stop receiving the count-changed event and the sender gets a connection error because no one is listening. Repository: check the source code here. What's next: add a content script, add a background script, add a dev-tools page, and deploy the extension. 2023-03-06 13:32:19
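The listener logic described above can be sketched as a pure function, so it runs outside a browser. In the real extension this is registered with `chrome.runtime.onMessage.addListener` inside Svelte's `onMount` hook and removed with `removeListener` on destroy; the `count-updated` type string and the response wording are assumptions standing in for the article's exact values.

```javascript
// Hypothetical reconstruction of the article's message handler, kept pure
// (no chrome.* calls) so the sync behavior can be demonstrated and tested.
function handleMessageEvent(state, request, sendResponse) {
  if (request.type === 'count-updated') {
    // Reply to the sender, then sync local state to the broadcast value.
    sendResponse({ message: `count changed from ${state.count} to ${request.count}` });
    state.count = request.count;
  }
  return state;
}

// Simulate two "tabs": tab A broadcasts a new count, tab B reacts to it.
const tabB = { count: 0 };
let reply;
handleMessageEvent(tabB, { type: 'count-updated', count: 5 }, (res) => { reply = res; });
console.log(tabB.count, reply.message); // → 5 count changed from 0 to 5
```

Keeping the handler pure like this also makes it trivial to unit-test the extension logic without a Chrome environment.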
海外TECH DEV Community What is an event-driven architecture and why storing events is important ? https://dev.to/aws-builders/what-is-an-event-driven-architecture-and-why-storing-events-is-important--om2 What is an event driven architecture and why storing events is important In this post we are going to see how to create a serverless analytics architecture based off an event driven architecture This is made possible with the help of services like Kinesis Data Firehose EventBridge Athena Glue Lambda and S To avoid rewriting yet another definition of this type of architecture let s turn to AWS s definition An event driven architecture relies on events to trigger communication between decoupled services This type of architecture is common in modern applications based on microservices An event can be a state change or update such as adding an item to the cart on an e commerce site Events can refer to a state purchased item modified price and entered delivery address or consist of identifiers shipment notification of an order Event driven architectures have three key elements Event producersEvent routersEvent consumersA producer sends an event to the router which filters the events and then passes them on to consumers The producer and consumer services are decoupled which allows them to be scaled updated and deployed independently source Now that you are familiar with this type of architecture or at least understand the principle it is important to understand that events become the heart of our application Without them the application no longer makes sense and it stops functioning as it is no longer receiving input We quickly realize that storing these events is essential The marketing team would like to know how many products were sold in the last few hours days months or years On the other hand architects would like to create a new service that would consume one or more existing events They would therefore like to know the number of events produced in x time If you are 
also convinced that storing and analyzing these events is essential you re lucky because you ve come to the right place The complete code presented in this article is available here Producing events on AWSThe central service of our architecture is AWS EventBridge the event bus service released by AWS in July This service is widely used and in fact used internaly by other AWS services Config rules for example are triggered by events passing through AWS event buses Each service has its own events that are emitted into EventBridge from the success failure of a Lambda function to a scale up or down in an auto scaling group and a lot of information passes through these buses example for auto scaling events But how do we create our own data bus in EventBridge and then send events into It s simple you can do it directly in the console or using your favorite infrastructure as code tool In this article I will use AWS Cloud Development Kit CDK to deploy my infrastructure the code used will be in TypeScript const businessEventsBus new EventBus app IpponBlogDemo Now that we have created our bus let me tell you about the formalism to use when sending events to EventBridge An event consists of fields time the time and date of our event that we are producing source the source corresponds to the domain from which your event was sent e g sales domain detailType defines the action related to the event If we stick to the perspective of an e commerce store we could have item sold for example in this field detail the content of the event this is where we will put specific fields product name price etc event bus name the name of the bus in which we want to send the event All of these fields are encapsulated in a JSON object and in our case sent via a TypeScript class that uses the AWS SDK to call the EventBridge API import EventBridgeClient PutEventsCommand from opt nodejs node modules aws sdk client eventbridge export default class EventSenderEventBridge private client new 
EventBridgeClient private busName myBusName send async gt const query new PutEventsCommand Entries EventBusName this busName Source “sales DetailType item sold Detail “name “toothbrush “price “quantity await this client send query How to store our data As I am a huge fan of the serverless world and managed services yes it does make life so much easier having a small wallet I turned to a flexible stack img alt Alt height src dev to uploads s amazonaws com uploads articles nphbvohbizgtj png title S logo on the left and Kinesis Data Firehose logo on the right lt br gt width We will use S to store our data This support is perfect for our use case It s cheap ultra flexible and managed by AWS We have the support for our data we know how to easily produce events Now we need that little glue between our receptacle and the events This glue will be provided by the Kinesis service particularly Kinesis Data Firehose Indeed there is an existing integration between Kinesis Data Firehose and EventBridge with S bucket as the destination First you may not know the Kinesis Data Firehose service Here s the AWS definition Amazon Kinesis Data Firehose is an Extract Transform and Load ETL service that captures transforms and reliably delivers streaming data to data lakes data stores and analytics services It may not be very clear yet so here s a diagram illustrating how the service works We can see on the left the AWS Services that contains the EventBridge service So our events will be transmitted to KDF Kinesis Data Firehose where we can easily transform them using a lambda or tools provided by AWS The result will finally be transmitted to the destination in our case an S bucket How to create and configure a Kinesis Data Firehose streamNow let s move on to the creation of our infrastructure with CDK and I ll try to explain the different fields required when creating our stream First you need to create an IAM role with certain permissions To avoid writing too much code I will only 
create the role and add an IAM policy to it to show you how to do it const deliveryStreamRole new Role construct props eventTypeUppercased DeliveryStreamRole assumedBy new ServicePrincipal firehose amazonaws com deliveryStreamRole addToPolicy new PolicyStatement resources actions logs CreateLogGroup logs PutLogEvents logs CreateLogStream Now you need to add certain IAM policies depending on your needs For the S service AbortMultipartUpload GetBucketLocation GetObject ListBucket ListBucketMultipartUploads PutObject For the Lambda service if you want to transform your input objects into a format different from that of EventBridge InvokeFunction GetFunctionConfiguration For the Glue service we will see the usefulness of this service later GetDatabase GetTable GetPartition GetTableVersions Permissions are now created It s time to create our S bucket to store our data const destinationBucket new Bucket construct BlogIpponDemoBucket bucketName blog ippon destination repository Let s move on to the big part the creation of our Kinesis Data FireHose KDF stream I will break down the creation into several steps to explain what each of them corresponds to First in the CDK document for the KDF service we need to choose a destination configuration to pass as an object in the constructor Here is a link to the documentation construct propsThere are different destinations that can be found there but the one we are interested in is S more specifically the extendedSDestinationConfiguration object The following code will be the configuration of this object For readability reasons we will add intermediate variables for certain parts Two variables will be needed the first concerns the processingConfiguration field and the second concerns the dataFormatConversionConfiguration field const processingConfiguration enabled true processors type Lambda parameters parameterName LambdaArn parameterValue BlogIpponDemoLambdaTransformationArn type AppendDelimiterToRecord parameters parameterName 
Delimiter parameterValue n We are now in the processingConfiguration section This section comes into play once the data has passed through the buffer which I will explain later Either we do nothing in which case the data goes directly to our destination or we decide to transform it before storing it In our case our source data is an EventBridge event We would like to be able to transform the source event into something more meaningful something that will make sense when we come to analyze it In this case we will use a small lambda that we will have built It s a very basic lambda that takes JSON as input and transforms it into another JSON as output our business event As a result we have streamlined the source data by removing unnecessary fields The second processor is just there to add a delimiter between each of our records The data is now ready to go to our S bucket const dataFormatConversionConfiguration enabled true inputFormatConfiguration deserializer openXJsonSerDe outputFormatConfiguration serializer parquetSerDe schemaConfiguration catalogId props glueDatabase catalogId roleArn deliveryStreamRole roleArn databaseName props glueDatabaseName tableName props glueTableName Our second variable is dataFormatConversionConfiguration It allows us to configure the format part of data conversion input and output and to define a schema for our data this is where Glue comes in Here our input events use the openXJsonSerDe serializer and for the output in order not to store the data as is we will store it in Apache Parquet format which is a columnar format which will reduce costs We will then use the parquetSerDe deserializer Now it s time to define a schema for our data This schema is contained in a Glue table which itself is contained in a database Here we only specify where our schema is stored part schemaConfiguration It is now time to assemble everything in our configuration object by adding the other fields const deliveryStream new CfnDeliveryStream construct props 
eventTypeUppercased DeliveryStream extendedSDestinationConfiguration bucketArn props destinationBucket bucketArn roleArn deliveryStreamRole roleArn bufferingHints intervalInSeconds cloudWatchLoggingOptions enabled true logGroupName kinesis log group logStreamName kinesis logs prefix year timestamp yyyy month timestamp MM day timestamp dd custom partition partitionKeyFromLambda custom partition errorOutputPrefix errors firehose error output type processingConfiguration processingConfiguration dataFormatConversionConfiguration dataFormatConversionConfiguration dynamicPartitioningConfiguration enabled true retryOptions durationInSeconds The first two fields correspond to the ARN of the destination S bucket and the ARN of the IAM role to use The bufferingHints field is interesting as it allows us to configure the management of our stream buffer The buffer corresponds to either the data retention time or the minimum amount of data to be sent Two fields are available intervalInSeconds which corresponds to the time part how long to wait The value of this field can range from s to s and defaults to s we wait at least minute between each send and up to minutes Then we have the sizeInMBs field which corresponds to the amount of data we should wait for before sending This amount varies between and with a default value of We now move on to the part concerning the logs Here we have a cloudWatchLoggingOptions object that allows us to activate logs in CloudWatch by setting the name of the CloudWatch Group and Stream We now have one of the crucial parts to define our stream the prefix to use prefix to know where to send our data in our S bucket and especially with which partitions Here I wanted to have a separation by datetime with a compartment per year month day and then by a custom partition that we will see later We use a partitioning mode based on Hive to take advantage of dynamic partitioning It s just necessary to know that we need to define our partition keys with this 
formalism partition key We retrieve the key here We do the same for the output of processing or sending errors with the errorOutputPrefix keyword The fact that errors are sent directly to the S bucket is super convenient we can easily see if there are errors We can also automate the detection of these errors by building a lambda that will be triggered when a file is added to this error compartment To finish creating our stream we will activate and configure dynamic partitioning using the dynamicPartitioningConfiguration keyword Dynamic partitioning is recent and offers many advantages The first is to be able to query our data according to keys that we define ourselves These keys will allow us to build SQL queries with WHERE clauses This will allow Athena the service we will use later to query to only pick the data we really care about and therefore not scan unnecessary data So we will have an improvement in performance cost and the number of scanned data The downside is that this option is recent and not necessarily well documented at the moment How to create and configure a Glue table to use with KDFHere the use of Glue that we make is based on the definition of a data schema in a database It should be noted that Glue is a serverless data integration service that allows analytics users to easily discover prepare move and integrate data from multiple sources So it is much more complete and complex than the use we make of it here We are almost there just two more steps and we can finally query our business events from EventBridge Let s move on to the creation of a Glue database and table const glueDatabase new CfnDatabase scope glue database catalogId IpponBlogDemoAwsAccountId databaseInput name IpponBlogDemoGlueDatabase There s no need to explain this step as the code is pretty basic with what you have seen in the article by now For the creation of the table the interesting configuration will be located in the tableInput object Since this object is quite large we 
will have to use a few intermediate variables These variables will be partitionKeys and columns const partitionKeys name year type string name month type string name day type string name custom partition type string This first object allows us to define partition keys These keys correspond to what we put in the prefix section when configuring our stream earlier The custom partition field here is just to remind us of the partition that we added thanks to the transformation lambda const columns name name comment Name of the item type string name price comment Price of the item type string name quantity comment Quantity of item type string This field will allow us to define the schema that our data will have in our S bucket Here we will have a name a price and a quantity for each item We can now assemble everything in our CDK object const glueTable new CfnTable scope glue table for athena databaseName IpponBlogDemoGlueDatabase catalogId IpponBlogDemoAwsAccountId tableInput name IpponBlogDemoGlueTable tableType EXTERNAL TABLE partitionKeys partitionsKeys storageDescriptor compressed true inputFormat org apache hadoop hive ql io parquet MapredParquetInputFormat outputFormat org apache hadoop hive ql io parquet MapredParquetOutputFormat serdeInfo serializationLibrary org apache hadoop hive ql io parquet serde ParquetHiveSerDe columns columns location IpponBlogDemoSBucketUrl The storageDescriptor field will allow us to describe how the data will be stored Here we will specify that our data will be compressed our input format will be Apache Parquet and our output format will also be Apache Parquet As I am not an expert in this area I will not go into the details of this configuration but it has the merit of working We will simply need to define the location of our data where it is stored using the location keyword we will pass in the URL of our S bucket We still need to describe to Glue what these partition keys correspond to i e the associated type possible values and the 
number of digits possible for example To do this we will use the addPropertyOverride function on the table we created earlier glueTable addPropertyOverride TableInput Parameters projection enabled true glueTable addPropertyOverride TableInput Parameters projection year type integer glueTable addPropertyOverride TableInput Parameters projection year range glueTable addPropertyOverride TableInput Parameters projection year digits glueTable addPropertyOverride TableInput Parameters projection month type integer glueTable addPropertyOverride TableInput Parameters projection month range glueTable addPropertyOverride TableInput Parameters projection month digits glueTable addPropertyOverride TableInput Parameters projection day type integer glueTable addPropertyOverride TableInput Parameters projection day range glueTable addPropertyOverride TableInput Parameters projection day digits glueTable addPropertyOverride TableInput Parameters projection custom partition type injected glueTable addPropertyOverride TableInput Parameters storage location template s ippon blog demo analytics repository year year month month day day custom partition custom partition In our case we specify that the year partition is a digit integer between and We repeat the process for months and days and let Glue define the type of our custom partition itself using the injected keyword It s very convenient but has a big disadvantage we can only perform equalities in SQL queries on this field afterwards which can be annoying when we want to retrieve information for a specific period and not just a fixed date for example And there you have it the job is done How to query data in S We now have a functional stack that allows us to retrieve events sent to EventBridge transform them into a business oriented format and send them to an S bucket with beautiful partitions The only thing left to do is to query all this data When it comes to analytics and S we often think of the AWS Athena service Athena is a 
service that lets you easily query an S3 bucket (and many other data sources) using SQL, with Apache Spark under the hood. On top of that, Athena is completely managed by AWS: no infrastructure management required, and that's awesome. The best part of the story? You don't need to do anything more: once you get to the Athena console, simply choose your Glue database, select the table in question, configure the destination for query results, and you'll be ready to query your data. What's magical is that storing our data in the Parquet format, combined with dynamic data partitioning, means this entire stack will cost you very little money. Unless, of course, you produce a huge number of events, because yes, Athena can be expensive. And when I say huge, I really mean a lot. On a personal project with a steady stream of events every day, the stack doesn't cost me a penny on AWS's free plan: a query over several days of events represents only a small amount of scanned data, and since Athena is priced per terabyte scanned, such a query costs less than a few cents. Of course, it all depends on the size of your events.

In conclusion, it is possible to optimize query performance with Athena: the fewer files we have in our S3 destination bucket, the faster Athena will execute the query. It is therefore possible, with the help of an EMR (Elastic MapReduce) job, to merge the files so as to have fewer but larger ones. In terms of costs, you also need to keep an eye on the Athena query-result bucket, and don't hesitate to set lifecycle configurations on the buckets to expire objects you no longer need (query results can quickly reach gigabytes). You should also think carefully about what you want to store: storing just for the sake of storing is not very useful, either for the business or for the wallet. Only store what can add value to the queries you are going to make; the rest is certainly not useful for analytics. 2023-03-06 13:31:39
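The storage.location.template above maps each event's date and custom partition to an S3 prefix. As a quick illustration of that layout, here is a small sketch; the helper below is hypothetical and not part of the article's stack:

```typescript
// Hypothetical helper mirroring the storage.location.template layout:
// year=YYYY/month=MM/day=DD/custom_partition=...
function partitionPrefix(date: Date, customPartition: string): string {
  const pad = (n: number, width: number) => n.toString().padStart(width, "0");
  return [
    `year=${pad(date.getUTCFullYear(), 4)}`,
    `month=${pad(date.getUTCMonth() + 1, 2)}`,
    `day=${pad(date.getUTCDate(), 2)}`,
    `custom_partition=${customPartition}`,
  ].join("/");
}

console.log(partitionPrefix(new Date(Date.UTC(2023, 2, 6)), "items"));
// → year=2023/month=03/day=06/custom_partition=items
```

Because Athena's partition projection parses these key=value path segments, zero-padding the month and day is what keeps the two-digit projection configuration and the actual S3 keys in agreement.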
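Athena's pay-per-scan pricing described above is easy to sanity-check with a little arithmetic. This sketch is not from the article: the function name and the per-terabyte price are assumptions (check current AWS pricing for the real figure):

```typescript
// Hypothetical cost estimator for per-terabyte-scanned pricing.
// PRICE_PER_TB is an assumed placeholder, not the article's figure.
const PRICE_PER_TB = 5;

function athenaQueryCost(bytesScanned: number, pricePerTb: number = PRICE_PER_TB): number {
  const TB = 1024 ** 4; // bytes in a tebibyte
  return (bytesScanned / TB) * pricePerTb;
}

// A query scanning 500 MB costs a tiny fraction of a cent at 5 units/TB.
console.log(athenaQueryCost(500 * 1024 ** 2)); // ≈ 0.0024
```

This is why the columnar Parquet format plus partition pruning matters so much: both shrink bytesScanned, which is the only variable you are billed on.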
Overseas TECH DEV Community Applying for a new job with React and NodeJS and AI https://dev.to/novu/applying-for-a-new-job-with-react-and-nodejs-and-ai-17a9

Applying for a new job with React and NodeJS and AI

TL;DR: In the previous article in the series, I walked you through building a resume-builder application that accepts some specific information from the user and creates a printable resume. In this article, we'll take the application one step further by adding a cold-emailing feature that lets users send job applications containing a winning cover letter and a resume, using the OpenAI API and EmailJS.

Why do you need it? I have been a programmer for the last decade, but when it comes to showing my skills on paper (marketing), I am not the person for it. I have been filtered out of many jobs just because of my resume; I haven't even gotten to the interview. We are going to change that today with GPT.

Project Setup and Installation

Clone the GitHub repository for the project here. Run npm install to install the project's dependencies. Log in or create an OpenAI account here. Click "Personal" on the navigation bar and select "View API keys" from the menu to create a new secret key. Add your OpenAI API key within the index.js file. Create a component called SendResume.js where users will provide the data required for sending the cold emails:

    cd client/src/components
    touch SendResume.js

Render the SendResume component via its own route with React Router:

    import React, { useState } from "react";
    import { BrowserRouter, Routes, Route } from "react-router-dom";
    import Home from "./components/Home";
    import Resume from "./components/Resume";
    // imports the component
    import SendResume from "./components/SendResume";

    const App = () => {
      const [result, setResult] = useState({});
      return (
        <div>
          <BrowserRouter>
            <Routes>
              <Route path="/" element={<Home setResult={setResult} />} />
              <Route path="/resume" element={<Resume result={result} />} />
              {/* displays the component */}
              <Route path="/send-resume" element={<SendResume />} />
            </Routes>
          </
BrowserRouter>
        </div>
      );
    };
    export default App;

Update the Home.js component to render a link at the top of the page that navigates the user to the SendResume component:

    const Home = () => {
      // ...other code statements
      return (
        <>
          <div className="buttonGroup">
            <button onClick={handlePrint}>Print Resume</button>
            <Link to="/send-resume" className="sendEmail">Send via Email</Link>
          </div>
          {/* ...other UI elements */}
        </>
      );
    };

Add the code snippet below to the src/index.css file:

    .buttonGroup {
      padding: <...>px;
      width: <...>;
      margin: auto;
      display: flex;
      align-items: center;
      justify-content: space-between;
      position: sticky;
      top: <...>;
      background-color: #fff;
    }
    .sendEmail {
      background-color: #f<...>;
      padding: <...>px;
      text-decoration: none;
      border-radius: <...>px;
    }
    .resume__title {
      margin-bottom: <...>px;
    }
    .companyDescription {
      padding: <...>px;
      border: <...>px solid #eee;
      border-radius: <...>px;
      margin-bottom: <...>px;
    }
    .sendEmailBtn {
      width: <...>px;
    }
    .nestedItem {
      display: flex;
      flex-direction: column;
      width: <...>;
    }
    .nestedItem > input {
      width: <...>;
    }

Building the application user interface

Update the SendResume component to display the required form fields as done below:

    import React, { useState } from "react";

    const SendResume = () => {
      const [companyName, setCompanyName] = useState("");
      const [jobTitle, setJobTitle] = useState("");
      const [companyDescription, setCompanyDescription] = useState("");
      const [recruiterName, setRecruiterName] = useState("");
      const [recruiterEmail, setRecruiterEmail] = useState("");
      const [myEmail, setMyEmail] = useState("");
      const [resume, setResume] = useState(null);

      const handleFormSubmit = (e) => {
        e.preventDefault();
        console.log("Submit button clicked");
      };

      return (
        <div className="app">
          <h2 className="resume__title">Send an email</h2>
          <form onSubmit={handleFormSubmit} encType="multipart/form-data">
            <div className="nestedContainer">
              <div className="nestedItem">
                <label htmlFor="recruiterName">Recruiter's Name</label>
                <input type="text" value={recruiterName} required onChange={(e) => setRecruiterName(e.target.value)} id="recruiterName" className="recruiterName" />
              </div>
              <div className="nestedItem">
                <label htmlFor="recruiterEmail">Recruiter's Email Address</label>
                <input
type="email" value={recruiterEmail} required onChange={(e) => setRecruiterEmail(e.target.value)} id="recruiterEmail" className="recruiterEmail" />
              </div>
            </div>
            <div className="nestedContainer">
              <div className="nestedItem">
                <label htmlFor="myEmail">Your Email Address</label>
                <input type="email" value={myEmail} required onChange={(e) => setMyEmail(e.target.value)} id="myEmail" className="myEmail" />
              </div>
              <div className="nestedItem">
                <label htmlFor="jobTitle">Position Applying For</label>
                <input type="text" value={jobTitle} required onChange={(e) => setJobTitle(e.target.value)} id="jobTitle" className="jobTitle" />
              </div>
            </div>
            <label htmlFor="companyName">Company Name</label>
            <input type="text" value={companyName} required onChange={(e) => setCompanyName(e.target.value)} id="companyName" className="companyName" />
            <label htmlFor="companyDescription">Company Description</label>
            <textarea rows={4} className="companyDescription" required value={companyDescription} onChange={(e) => setCompanyDescription(e.target.value)} />
            <label htmlFor="resume">Upload Resume</label>
            <input type="file" accept=".pdf,.doc,.docx" required id="resume" className="resume" onChange={(e) => setResume(e.target.files[0])} />
            <button className="sendEmailBtn">SEND EMAIL</button>
          </form>
        </div>
      );
    };
    export default SendResume;

The code snippet accepts the company's name and description, the job title, the recruiter's name and email, the user's email, and the resume. It only accepts resumes in PDF or Word format. All this data is necessary for creating an excellent, well-tailored cover letter. In the upcoming sections, I'll guide you through generating the cover letter and emailing it.

Import the Loading component and React Router's useNavigate hook into the SendResume.js file:

    import Loading from "./Loading";
    import { useNavigate } from "react-router-dom";

    const SendResume = () => {
      const [loading, setLoading] = useState(false);
      const navigate = useNavigate();
      if (loading) return <Loading />;
      // ...other code statements
    };

The code snippet above displays the
Loading page while the POST request is still pending. Stay with me as I walk you through the logic. Update the handleFormSubmit function to send all the form inputs to the Node.js server:

    const handleFormSubmit = (e) => {
      e.preventDefault();
      // form object
      const formData = new FormData();
      formData.append("resume", resume, resume.name);
      formData.append("companyName", companyName);
      formData.append("companyDescription", companyDescription);
      formData.append("jobTitle", jobTitle);
      formData.append("recruiterEmail", recruiterEmail);
      formData.append("recruiterName", recruiterName);
      formData.append("myEmail", myEmail);
      // imported function
      sendResume(formData, setLoading, navigate);
      // states update
      setMyEmail("");
      setRecruiterEmail("");
      setRecruiterName("");
      setJobTitle("");
      setCompanyName("");
      setCompanyDescription("");
      setResume(null);
    };

In the code snippet above, I added all the form inputs (the file and the text fields) into a JavaScript FormData object. The sendResume function accepts the form data, the setLoading function, and the navigate variable as parameters. Next, let's create that function.

Create a utils folder containing a util.js file within the client/src folder:

    cd client/src
    mkdir utils
    cd utils
    touch util.js

Create the sendResume function within the util.js file as done below; we'll configure the POST request inside it:

    const sendResume = (formData, setLoading, navigate) => {};

    export default sendResume;

Import Axios (already installed) and send the form data to the Node.js server:

    import axios from "axios";

    const sendResume = (formData, setLoading, navigate) => {
      setLoading(true);
      axios
        .post("http://localhost:<port>/resume/send", formData)
        .then((res) => console.log("Response", res))
        .catch((err) => console.error(err));
    };

The function sets the loading state to true (displaying the Loading component) while the request is pending, then makes a POST request to the endpoint on the server. Create the endpoint on the server as done below:

    app.post("/resume/send", upload.single("resume"), async (req, res) => {
      const { recruiterName, jobTitle, myEmail, recruiterEmail, companyName, companyDescription } = req.body;
      // log the contents
      console.log({
        recruiterName,
        jobTitle,
        myEmail,
        recruiterEmail,
        companyName,
        companyDescription,
        resume: `http://localhost:<port>/uploads/${req.file.filename}`,
      });
    });

The code snippet above accepts the data from the front end and uploads the resume to the server via Multer. The resume property contains the resume's URL on the server.

Generating the cover letter via the OpenAI API

In the previous article, we accepted the user's work experience, name, and list of proficient technologies. Make these variables global to enable the /resume/send endpoint to access their contents:

    let workArray = [];
    let applicantName = "";
    let technologies = [];

    app.post("/resume/create", upload.single("headshotImage"), async (req, res) => {
      const { fullName, currentPosition, currentLength, currentTechnologies, workHistory } = req.body;
      workArray = JSON.parse(workHistory);
      applicantName = fullName;
      technologies = currentTechnologies;
      // ...other code statements
    });

In the code snippet, workArray, applicantName, and technologies are the global variables containing the user's work experience, full name, and skills. These variables are required when creating the cover letter. Update the /resume/send endpoint as done below:

    app.post("/resume/send", upload.single("resume"), async (req, res) => {
      const { recruiterName, jobTitle, myEmail, recruiterEmail, companyName, companyDescription } = req.body;
      const prompt = `My name is ${applicantName}. I want to work for ${companyName}; they are ${companyDescription}. I am applying for the job ${jobTitle}. I have been working before for ${remainderText}. And I have used technologies such as ${technologies}. I want to cold email ${recruiterName} my resume and write why I fit for the company. Can you please write me the email in a friendly voice, not official, without a subject, a maximum of <n> words, and say in the end that my CV is attached?`;
      const coverLetter = await GPTFunction(prompt);
      res.json({
        message: "Successful",
        data: {
          cover_letter: coverLetter,
          recruiter_email: recruiterEmail,
          my_email: myEmail,
          applicant_name: applicantName,
          resume: `http://localhost:<port>/uploads/${req.file.filename}`,
        },
      });
    });

From the code snippet above, we destructured all the data
accepted from the React app and created a prompt for the AI using that data. The prompt is then passed into the OpenAI API to generate an excellent cover letter for the user. All the data required for sending the email is then returned as a response to the React app; the resume variable holds the resume document's URL. Recall from the previous tutorial that the GPTFunction accepts a prompt and generates a response to the request:

    const GPTFunction = async (text) => {
      const response = await openai.createCompletion({
        model: "text-davinci-003",
        prompt: text,
        temperature: <TEMPERATURE>,
        max_tokens: <MAX_TOKENS>,
        top_p: <TOP_P>,
        frequency_penalty: <FREQUENCY_PENALTY>,
        presence_penalty: <PRESENCE_PENALTY>,
      });
      return response.data.choices[0].text;
    };

Sending emails via EmailJS in React

Here, I'll guide you through adding EmailJS to the React application and sending the AI-generated email to recruiters. Install EmailJS by running the command below:

    npm install @emailjs/browser

Create an EmailJS account here and add an email service provider to your account. Create an email template as done in the image below; the words in curly brackets represent variables that can hold dynamic data. Import the EmailJS package into the util.js file and send the email to the recruiter's address:

    import emailjs from "@emailjs/browser";

    axios
      .post("http://localhost:<port>/resume/send", formData)
      .then((res) => {
        if (res.data.message) {
          const { cover_letter, recruiter_email, my_email, applicant_name, resume } = res.data.data;
          emailjs
            .send(
              "<SERVICE_ID>",
              "<TEMPLATE_ID>",
              { cover_letter, applicant_name, recruiter_email, my_email, resume },
              "<YOUR_PUBLIC_KEY>"
            )
            .then((res) => {
              if (res.status === 200) {
                setLoading(false);
                alert("Message sent!");
                navigate("/");
              }
            })
            .catch((err) => console.error(err));
        }
      })
      .catch((err) => console.error(err));

The code snippet checks whether the request was successful; if so, it destructures the data from the response and sends an email containing the cover letter and resume to the recruiter. The variables created within the template on your EmailJS dashboard are replaced with the data passed into the send function. The user is then
redirected to the home page and notified that the email has been sent. Congratulations! You've completed the project for this tutorial. Here is a sample of the email delivered.

Conclusion

So far in this series, you've learnt what OpenAI GPT-3 is, how to upload images via forms in a Node.js and React.js application, how to interact with the OpenAI GPT-3 API, how to print React web pages via the react-to-print library, and how to send emails via EmailJS in React. This tutorial walks you through an example of an application you can build using the OpenAI API. A little upgrade you can add is authenticating users and sending the resume as an attachment instead of a URL. PS: Sending attachments via EmailJS is not free; you'll have to subscribe to a payment plan. The source code for this tutorial is available here. Thank you for reading!

A small request: we are Novu, an open-source notification infrastructure service that can manage all your notification channels (Email, SMS, Push Notifications, Chat, In-App). You can use Novu to automate all of your channels together in a simple dashboard. I produce content weekly; your support helps a lot to create more content. Please support me by starring our GitHub library. Thank you very, very much! ❤️ 2023-03-06 13:13:35
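The EmailJS template mechanics described in the article (curly-bracket variables replaced by the values passed to send) can be illustrated with a tiny substitution function. This is purely illustrative: the function and template text below are made up, and the real substitution happens on EmailJS's servers.

```typescript
// Illustrative only: EmailJS performs this kind of substitution server-side.
// {{variable}} placeholders are filled from the object passed to emailjs.send().
function fillTemplate(template: string, vars: Record<string, string>): string {
  return template.replace(/\{\{(\w+)\}\}/g, (match: string, key: string) => vars[key] ?? match);
}

const emailTemplate = "{{cover_letter}}\n\nBest regards,\n{{applicant_name}}";

console.log(fillTemplate(emailTemplate, {
  cover_letter: "I would love to join your team. My CV is attached.",
  applicant_name: "Jane Doe",
}));
```

Note that an unknown placeholder is left intact, which mirrors how a template referencing a variable you never pass to send simply keeps the literal {{...}} text in the delivered email.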
Apple AppleInsider - Frontpage News Why refurbished iPhones are the smart choice for budget-conscious buyers https://appleinsider.com/articles/23/03/06/why-refurbished-iphones-are-the-smart-choice-for-budget-conscious-buyers?utm_medium=rss Why refurbished iPhones are the smart choice for budget-conscious buyers: The focus of most smartphone buyers is on the latest models, but a refurbished iPhone can be a better choice for many who seek an upgraded device. You could get a new iPhone, but a reconditioned iPhone could be just as good, and cheaper too. Apple and other smartphone producers consistently push consumers to upgrade to the latest releases, with large numbers doing so every couple of years. With frequent updates, consumers get the latest features while phone makers earn a hefty chunk of change in return. Read more 2023-03-06 13:53:02
Apple AppleInsider - Frontpage News iPhone SE 4 may get screens from Apple's newest OLED supplier https://appleinsider.com/articles/23/03/06/iphone-se-4-may-get-screens-from-apples-newest-oled-supplier?utm_medium=rss iPhone SE 4 may get screens from Apple's newest OLED supplier: Display producer BOE may still end up producing OLED screens for Apple's latest models, with it apparently set to supply the panels for the fourth-generation iPhone SE. Chinese manufacturer BOE has a history of making iPhone display panels, but also of having a troubled relationship with Apple concerning the quality of its components. Though it has reportedly failed to be added to the supply chain for a rumored OLED iPad, it could still be picked to provide panels for another device. Read more 2023-03-06 13:03:48
Cisco Cisco Blog Navigating Economic and Cybersecurity Threats in 2023 https://feedpress.me/link/23532/16006578/navigating-economic-and-cybersecurity-threats-in-2023 breach 2023-03-06 13:00:49
Finance RSS FILE - 日本証券業協会 (Japan Securities Dealers Association) Regarding the response to COVID-19 https://www.jsda.or.jp/shinchaku/coronavirus/index.html COVID-19 2023-03-06 15:00:00
News @日本経済新聞 電子版 ChatGPT sends ripples through U.S. schools: "it robs students of thinking" vs. "a new wave" https://t.co/7vTacYuuiO https://twitter.com/nikkei/statuses/1632741816922669056 chatgpt 2023-03-06 13:55:58
News @日本経済新聞 電子版 https://t.co/5cy4Q7Azxw It is almost unprecedented for a U.S. president to enter a country at war where no American troops are stationed, and if Ukraine, which the U.S. has backed, loses, it will be seen as "America's defeat." We consider possible exits from Russia's invasion.… https://t.co/9DFU4LcjgG https://twitter.com/nikkei/statuses/1632736935088201730 It is almost unprecedented for a U.S. president to enter a country at war where no American troops are stationed; if Ukraine, which the U.S. has backed, loses, it will be seen as "America's defeat." 2023-03-06 13:36:34
News @日本経済新聞 電子版 Personal demand for gold surges in Russia, bullion purchases up fivefold amid inflation fears https://t.co/3KAO9BB78k https://twitter.com/nikkei/statuses/1632736034076385280 demand 2023-03-06 13:32:59
News @日本経済新聞 電子版 Shohei Ohtani stuns with two three-run homers in WBC warm-up game https://t.co/UIIO4sHMPR https://twitter.com/nikkei/statuses/1632734897440976896 warm-up game 2023-03-06 13:28:28
News @日本経済新聞 電子版 Wheat price increase held to about 5% for government selling prices from April https://t.co/rt9AEzj18j https://twitter.com/nikkei/statuses/1632731976087269377 selling price 2023-03-06 13:16:52
News @日本経済新聞 電子版 Spy balloon in territorial airspace: how would Japan respond? One month since the U.S. shoot-down https://t.co/phT8FTCzPr https://twitter.com/nikkei/statuses/1632729638404521985 airspace 2023-03-06 13:07:35
Overseas News Japan Times latest articles Shohei Ohtani thrills fans with two homers as Japan beats Hanshin in WBC warmup https://www.japantimes.co.jp/sports/2023/03/06/baseball/ohtani-two-homers-japan-win/ tigers 2023-03-06 22:08:55
ニュース BBC News - Home Cardiff car crash: Three dead after missing people search https://www.bbc.co.uk/news/uk-64859195?at_medium=RSS&at_campaign=KARANGA early 2023-03-06 13:53:46
ニュース BBC News - Home Plan for lifetime ban for Channel migrants is unworkable, say charities https://www.bbc.co.uk/news/uk-64848101?at_medium=RSS&at_campaign=KARANGA proposals 2023-03-06 13:16:04
ニュース BBC News - Home Charles Bronson would not cope with release, parole panel told https://www.bbc.co.uk/news/uk-england-beds-bucks-herts-64861518?at_medium=RSS&at_campaign=KARANGA goldilocks 2023-03-06 13:32:46
ニュース BBC News - Home CBI boss Tony Danker steps aside after misconduct allegations https://www.bbc.co.uk/news/business-64861370?at_medium=RSS&at_campaign=KARANGA danker 2023-03-06 13:36:39
ニュース BBC News - Home Watch: Ceiling collapses, nearly hits commuter in US https://www.bbc.co.uk/news/world-us-canada-64865242?at_medium=RSS&at_campaign=KARANGA inspect 2023-03-06 13:12:33
ニュース BBC News - Home Wayne Couzens sentenced over indecent exposures https://www.bbc.co.uk/news/uk-england-london-64860324?at_medium=RSS&at_campaign=KARANGA everard 2023-03-06 13:29:00
ニュース BBC News - Home Bangladesh v England: Tigers win by 50 runs to stop ODI series clean sweep https://www.bbc.co.uk/sport/cricket/64859360?at_medium=RSS&at_campaign=KARANGA chittagong 2023-03-06 13:54:04
ニュース BBC News - Home Blackpool attack: Football fan dies after post-match brawl at pub https://www.bbc.co.uk/news/uk-england-lancashire-64861231?at_medium=RSS&at_campaign=KARANGA flowers 2023-03-06 13:22:54
Overseas TECH reddit A crazy interaction happened when I confronted my neighbor about burning trash. What to do? https://www.reddit.com/r/japanlife/comments/11jybxo/a_crazy_interaction_happened_when_i_confronted/ A crazy interaction happened when I confronted my neighbor about burning trash. What to do? So I posted a week or so ago: my neighbor burns trash at the same time I hang the kids' laundry. We live in the inaka, so the police can't really help. So I went and knocked on their door. Grandma came out and was nice, even spoke some English. I explained the situation and thought we were good. Then the dad comes out. Total Yankee; my Japanese is all right, but his is so fast and crazy. Got the wife with me though, so not a problem. Basically, we asked them to stop burning trash at night because we are hanging laundry; they have a small kid, so they should understand. Grandma, who owns the home, understands and says sorry. The dad? He says he will kill our cat and that he can't park his car with my car on the street. From here, things got heated. First of all, I said just call the cops; I am allowed to park there. I went to driving school, I know the laws. But threatening to kill my cat? Come on, bruh. I should report this to the police tomorrow, right? submitted by /u/qwertyqyle to r/japanlife [link] [comments] 2023-03-06 13:02:07
