Posted: 2022-04-05 03:35:10 | RSS feed digest for 2022-04-05 03:00 (35 items)

Category / Site / Article title or trend word / Link URL / Frequent words or summary / Search volume / Date registered
AWS AWS Marketplace Securing your AWS Control Tower multi-account environment with Lacework https://aws.amazon.com/blogs/awsmarketplace/securing-your-aws-control-tower-multi-account-environment-with-lacework/ Securing your AWS Control Tower multi-account environment with Lacework: For enterprise organizations, managing security and governance across hundreds or thousands of accounts can be challenging. AWS Control Tower and Lacework make this task much easier and enable seamless multi-account cloud security. By using Lacework in your AWS Control Tower environment, you can automatically and consistently apply security best practices and monitoring to new accounts… 2022-04-04 17:44:59
AWS AWS Database Blog Use parallelism to optimize querying large amounts of data in Amazon DynamoDB https://aws.amazon.com/blogs/database/use-parallelism-to-optimize-querying-large-amounts-of-data-in-amazon-dynamodb/ Use parallelism to optimize querying large amounts of data in Amazon DynamoDB: In this post, I demonstrate how to optimize querying a large amount of data in Amazon DynamoDB by using parallelism, splitting the original query into multiple parallel subqueries, to meet these strict performance SLAs for large DynamoDB database queries. During our engagements with customers, we often need to retrieve a large number of… 2022-04-04 17:09:32
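The excerpt above describes splitting one large DynamoDB query into several parallel subqueries. As a minimal illustration of the same parallelism idea (not necessarily the exact approach the post takes), the sketch below runs a segmented parallel Scan with the AWS SDK for JavaScript v3; the table name and region are placeholders.

```typescript
// Minimal sketch: parallel segmented Scan with the AWS SDK for JavaScript v3.
// This illustrates the parallelism idea from the post using DynamoDB's
// built-in Segment/TotalSegments scan parameters. "MyLargeTable" and the
// region are placeholders, not values from the post.
import { DynamoDBClient, ScanCommand, AttributeValue } from "@aws-sdk/client-dynamodb";

const client = new DynamoDBClient({ region: "us-east-1" });

async function scanSegment(segment: number, totalSegments: number) {
  const items: Record<string, AttributeValue>[] = [];
  let lastKey: Record<string, AttributeValue> | undefined;
  do {
    // Each call reads one page of one segment; paginate until the segment is done.
    const page = await client.send(
      new ScanCommand({
        TableName: "MyLargeTable",
        Segment: segment,
        TotalSegments: totalSegments,
        ExclusiveStartKey: lastKey,
      })
    );
    items.push(...(page.Items ?? []));
    lastKey = page.LastEvaluatedKey;
  } while (lastKey);
  return items;
}

async function parallelScan(totalSegments = 4) {
  // All segments are read concurrently; results are merged at the end.
  const segments = Array.from({ length: totalSegments }, (_, i) =>
    scanSegment(i, totalSegments)
  );
  return (await Promise.all(segments)).flat();
}

parallelScan().then((items) => console.log(`Read ${items.length} items`));
```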
AWS AWS AWS Sports is Changing the Game | Amazon Web Services https://www.youtube.com/watch?v=gMj8UGtjpXg AWS Sports is Changing the Game | Amazon Web Services: The world's best sports organizations are using AWS to build data-driven solutions and reinvent the way sports are watched, played, and managed. AWS provides cloud services that are at the core of sports innovation, athlete optimization, and epic fandom. Machine learning (ML) is producing advanced stats like the NFL's Next Gen Stats; high performance computing (HPC) is redesigning F1 race cars and America's Cup boats; organizations like NASCAR are using artificial intelligence (AI) services to automatically tag and categorize archives; and the Internet of Things (IoT) is connecting sensors from the field and track to deliver real-time data. Whether it's predicting the probability of a catch in real time or forecasting ticket sales after a winning season, technology is changing the game. And AWS is how. Learn more about AWS Sports at the links in the video description; subscribe for more AWS videos and AWS events videos. ABOUT AWS: Amazon Web Services (AWS) is the world's most comprehensive and broadly adopted cloud platform, offering fully featured services from data centers globally. Millions of customers, including the fastest-growing startups, largest enterprises, and leading government agencies, are using AWS to lower costs, become more agile, and innovate faster. Tags: AWSSports, AWS, AmazonWebServices, CloudComputing 2022-04-04 17:16:33
海外TECH Ars Technica Google Play crackdown makes Amazon, Barnes & Noble pull digital purchases https://arstechnica.com/?p=1845567 google 2022-04-04 17:11:59
海外TECH Ars Technica Elon Musk buys 9% of Twitter stock as he pressures company on “free speech” https://arstechnica.com/?p=1845637 board 2022-04-04 17:02:40
海外TECH MakeUseOf How to Back Up Your PlayStation 5 Data https://www.makeuseof.com/how-to-back-up-data-ps5/ captures 2022-04-04 17:45:13
海外TECH MakeUseOf 5 Websites to Create Free Customized Video Resumes https://www.makeuseof.com/websites-to-create-free-video-resume/ templates 2022-04-04 17:30:13
海外TECH DEV Community Web Components 101: Framework Comparison https://dev.to/this-is-learning/web-components-101-framework-comparison-989 Web Components Framework ComparisonAlright alright I know for a lot of the last article seemed like a big ad for Lit That said I promise I m not unable to see the advantages of other frameworks Lit is a tool in a web developer s toolbox Like any tool it has its pros and cons times when it s the right tool for the job and other times when it s less so That said I d argue that using an existing framework is more often the better tool for the job than vanilla web components To showcase this let s walk through some of these frameworks and compare and contrast them to home growing web components Pros and Cons of Vanilla Web ComponentsWhile web frameworks are the hot new jazz it s not like we couldn t make web applications before them With the advent of WC standardized web components without Lit doing so today is better than it s ever been Here are some pros and cons of Vanilla JavaScript web components ProsNo framework knowledge requiredLess reliance on frameworkMaintenanceBugsSecurity issuesSmaller “hello world sizeMore control over render behaviorConsRe rendering un needed elements is slowHandling event passing is trickyCreating elements can be overly verboseBinding to props requires element queryYou ll end up building Lit anywayTo the vanilla way of doing things credit there s a bit of catharsis knowing that you re relying on a smaller pool of upstream resources There s also a lessened likelihood of some bad push to NPM from someone on the Lit team breaking your build Likewise for smaller apps you re likely to end up with a smaller output bundle That s a huge win For smaller applications where performance is critical or simply for the instances where you need to be as close to the DOM as possible vanilla web components can be the way to go That said it s not all roses After all this series has already demonstrated that things like event passing and prop binding are verbose compared to Lit Plus things may not be as good as they seem when it comes to performance Incremental RenderingOn top of the aforementioned issues with avoiding a framework like Lit something we haven t talked about much is incremental rendering A great example of this would come into play if we had an array of items we wanted to render and weren t using Lit Every time we needed to add a single item to that list our innerHTML trick would end up constructing a new element for every single item in the list What s worse is that every subelement would render as well This means that if you have an element like this lt li gt lt a href gt lt div class flex p bg yellow gt lt span gt Go to this location lt span gt lt div gt lt a gt lt li gt And only needed to update the text for a single item in the list you d end up creating more elements for the item you wanted to update…On top of recreating the nodes including the Text Node for every other item in the list Building Your Own FrameworkAs a result of the downsides mentioned many that choose to utilize vanilla web components often end up bootstrapping their own home grown version of Lit Here s the problem with that You ll end up writing Lit yourself sure but with none of the upsides of an existing framework This is the problem with diving headlong into vanilla web components on their own Even in our small examples in the article dedicated to vanilla web components we emulated many of the patterns found within Lit Take this code from the article lt 
script gt class MyComponent extends HTMLElement todos connectedCallback this render This function can be accessed in element query to set internal data externally setTodos todos this todos todos this clear this render clear for const child of this children child remove render this clear Do logic customElements define my component MyComponent lt script gt Here we re writing our own clear logic handling dynamic value updates and more The obvious problem is that we d then have to copy and paste most of this logic in many components in our app But let s say that we were dedicated to this choice and broke it out into a class that we could then extend Heck let s even add in some getters and setters to make managing state easier lt script gt Base js class OurBaseComponent extends HTMLElement connectedCallback this doRender createState obj return Object keys obj reduce prev key gt This introduces bugs prev key obj key prev key get gt prev key set val gt this changeData gt prev key val changeData callback callback this clear this doRender clear for const child of this children child remove doRender callback this clear callback lt script gt Now our usage should look fairly simple lt script gt MainFile js class MyComponent extends OurBaseComponent state createState todos render this doRender gt this innerHTML lt h gt You have this state todos length todos lt h gt customElements define my component MyComponent lt script gt That s only lines to declare a UI component Only now you have a bug with namespace collision of state with underscores your doRender doesn t handle async functions and you still have many of the downsides listed below You could work on fixing these but ultimately you ve created a basis of what Lit looks like today but now you re starting at square one No ecosystem on your side no upstream maintainers to lean on Pros and Cons of Lit FrameworkWith the downsides and upsides of vanilla web components in mind let s compare the pros and cons of what building components using Lit looks like ProsFaster re renders that are automatically handledMore consolidated UI logicMore advanced tools after masterySmaller footprint than other frameworksConsFramework knowledge requiredFuture breaking changesNot as widely known used as other frameworks Vue React Angular While there is some overlap between this list of pros and cons and the one for avoiding Lit in favor of home growing there s a few other items here Namely this table highlights the fact that Lit isn t the only framework for building web components There s huge alternatives like React Vue and Angular These ecosystems have wider adoption and knowledge than Lit which may make training a team to use Lit more difficult However Lit has a key advantage over them ignoring being able to output to web components for a moment we ll come back to that Even compared to other frameworks Lit is uniquely lightweight Compare the bundle sizes of Vue a lightweight framework in it s own right compared to Lit While tree shaking will drastically reduce the bundle size of Vue for smaller applications Lit will still likely win out for a simple component system Other FrameworksLit framework isn t alone in being able to output to web components however In recent years other frameworks have explored and implemented various methods of writing code for a framework that outputs to web components For example the following frameworks have official support for creating web components without changing implementation code VueAngularPreactVue in particular has made massive 
strides in improving the web component development experience for their users What s more is that these tools tend to have significantly larger ecosystems Take Vue for example Want the ability to change pages easily Vue RouterWant a global store solution VuexPrefer similar class based components Vue Class Component LibraryPrebuilt UI components Ant DesignWhile some ecosystem tools might exist in Lit they certainly don t have the same breadth That s not to say it s all good in the general web component ecosystem Some frameworks like React have issues with Web Component interop that may impact your ability to merge those tools together Why Web Components You may be asking if you re going to use a framework like Vue or React anyway why even bother with web components Couldn t you instead write an app in one of those frameworks without utilizing web components You absolutely can and to be honest this is how most apps that use these frameworks are built But web components play a special role in companies that have multiple different projects Consolidation Let s say that you work for BigCorp the biggest corporation in Corpville BigCorp has dozens and dozens of full scale applications and not all of them are using the same frontend framework This might sound irresponsible of BigCorp s system architects but in reality sometimes a framework is better geared towards specific applications Additionally maybe some of the apps were part of an acquisition or merger that brought them into the company After all the user doesn t care or often know about what framework a tool is built with You know what a user does care about The fact that each app in a collection all have vastly different UIs and buttons While this is clearly a bug if both codebases implement the buttons on their own you ll inevitably end up with these types of problems this being on top of the work hours your teams have to spend redoing one another s work for their respective frameworks And that s all ignoring how difficult it can be to get designers to have consistency between different project s design components like buttons Web Components solve this problem If you build a shared component system that exports web components you can then use the same codebase across multiple frameworks Once the code is written and exported into web components it s trivial to utilize these new web components in your application Like it can be a single line of code trivial From this point you re able to make sure the logic and styling of these components are made consistent between applications even if different frameworks ConclusionWhile web components have had a long time in the oven they came out swinging And while Lit isn t the only one at the table they ve certainly found a strong foothold in capabilities Lit s lightweightness paired with web component s abilities to integrate between multiple frameworks is an incredible one two punch that makes it a strong candidate for any shared component system What s more the ability to transfer knowledge from other frameworks makes it an easy tool to place in your toolbox for usage either now or in the future Regardless whether you re using Vue React Angular Lit Vanilla Web Components or anything else we wish you happy engineering 2022-04-04 17:30:40
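The "Building Your Own Framework" portion of the excerpt above sketches an OurBaseComponent/MyComponent pair with a createState helper, but the code arrives flattened into the summary text. Below is a small, runnable reconstruction of that pattern in TypeScript. It is illustrative rather than the article's verbatim code, and it deliberately swaps the article's hand-rolled getters/setters (which the article itself flags as bug-prone) for a Proxy.

```typescript
// A runnable sketch of the "grow your own base class" pattern the article
// walks through. OurBaseComponent / MyComponent / createState are the
// article's names; the Proxy-based state tracking is a substitution.
class OurBaseComponent extends HTMLElement {
  // Wrap state so that every property assignment triggers a full re-render.
  protected createState<T extends object>(initial: T): T {
    return new Proxy({ ...initial }, {
      set: (target, prop, value) => {
        Reflect.set(target, prop, value);
        this.doRender(() => this.render());
        return true;
      },
    }) as T;
  }

  // Naive "clear everything and redraw" — exactly the cost the article points
  // to when arguing for Lit's incremental rendering.
  protected doRender(callback: () => void): void {
    this.innerHTML = "";
    callback();
  }

  protected render(): void {}

  connectedCallback(): void {
    this.doRender(() => this.render());
  }
}

class MyComponent extends OurBaseComponent {
  state = this.createState({ todos: [] as string[] });

  protected override render(): void {
    this.innerHTML = `<h1>You have ${this.state.todos.length} todos</h1>`;
  }
}

customElements.define("my-component", MyComponent);

// Usage: <my-component></my-component> renders the heading; reassigning a
// proxied property re-renders it, e.g.:
// (document.querySelector("my-component") as MyComponent).state.todos = ["Write post"];
```

Note that only reassignment is tracked here; mutating the array in place (push) would not re-render, which is one more of the sharp edges the article cites when steering readers toward Lit instead of a home-grown base class.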
海外TECH DEV Community How to read a CSV File in C# (Step by Step Tutorial) https://dev.to/bristolsamo/how-to-read-a-csv-file-in-c-step-by-step-tutorial-19pn How to read a CSV File in C Step by Step Tutorial a miles easier manner to have your software percentage information is by analyzing and writing Comma Separated Values CSV documents CSV files can without difficulty be examined and written with the aid of many programs including Microsoft Excel For the top part studying and writing CSV files is trivial Because of the call suggestions a CSV document is virtually a simple textual content file that includes one or more values in step with the line separated by commas Each fee is a subject or column in a spreadsheet and every line is a report or row in a spreadsheet But there are slightly more paintings concerned Double fees are used to wrap values that incorporate commas so that the commas aren t interpreted as a cost separator The equal is also performed for values that contain double prices In addition double fees collectively characterize a double quote within the fee and not a fee separator Working with numerous Excel codecs often requires reading facts reconfiguring them programmatically In this article we can learn how to study a CSV record and parse records from an Excel spreadsheet in C using IronXL the precise tool for the job What are CSV files CSVs Comma Separated Values are famous import and export information codecs utilized in spreadsheets and databases Usually statistics are saved on one line and separated using commas It s miles essential to use our CSV assist package for our initiatives CSV is a simple information layout but there can be many differences Those might also encompass one of a kind delimiters new lines or costs It s miles possible to study and Write CSV data with the assistance of the CSV help library Load Workbook and Access WorksheetThe workBook is the magnificence of IronXL whose item presents complete get entry to the Excel record and all of its features For instance if we want to get an entry to an Excel file we would use the code WorkBook wb WorkBook Load sample xlsx Excel file pathTo get admission to the precise WorkSheet of an Excel document IronXL presents the WorkSheet class WorkSheet ws wb GetWorkSheet Sheet by sheet nameOnce you ve received the ExcelSheet ws you can extract any kind of information from it and perform all Excel functions data can be accessed from the ExcelSheet ws on this method using IronXL static void Main string args WorkBook wb WorkBook Load sample xlsx WorkSheet ws wb GetWorkSheet Sheet foreach var cell in ws A A Console WriteLine value is cell Text Console ReadKey The foreach var is used to loop across each cell in the workbook and the values are stored in a string array How to read a CSV file in c and store data as var valuesif you have CSV documents rather than an excel report you have to make slight adjustments to the code to read a CSV file Allow s to examine the subsequent instance I have the same weather record saved as weather csv so I m using the equal report for all the examples Again you could use an example in keeping with your necessities WorkBook workbook WorkBook LoadCSV Weather csv fileFormat ExcelFileFormat XLSX ListDelimiter WorkSheet ws workbook DefaultWorkSheet workbook SaveAs Csv To Excel xlsx The primary line will load the CSV file and outline the document layout The second one line will pick out the worksheet simultaneously as the alternative code strains are similar to those inside the preceding examples 
Each CSV file line is a var line the foreach is used to loop over each variable line using the var reader The read data is then stored in a var path OUTPUTthe read row is the first row in the CSV file is the header row All the rows are displayed on a separate line A Workbook item is created The LoadCSV technique for the Workbook object is then used to specify the call of the CSV its format which delimiters are used in the CSV file being examined to separate var row In this case commas are used as delimiters A Worksheet object is then created This is where the contents of the CSV file might be placed Then the file is saved beneath a brand new call and layout How to use a CSV parser in C to use a parser you need first to read a CSV file CSVs have an abundance of troubles with how line breaks are handled in fields or how fields can be contained in fees that block a simple string break up method I ve lately discovered the following options while converting CSV files to C files Internet it s the first time I ve ever used simply one string split instead of an easy Strings cutup to split the values in a comma In this article we can look at the great ways in which C has a CSV parser feature in C XcPC internet It is simple to create a CSV parser With the most effective lines of code you could load a CSV file and convert it to Excel private void button Click object sender EventArgs e WorkBook wb WorkBook Load Normal Excel File xlsx Import xls csv or tsv file wb SaveAs Parsed CSV csv Exported as Parsed CSV Sheet csv After you have set up IronXL create a brand new project and add the IronXL namespaceusing IronXL Load a CSV record into Excelthe following code uses the Workbook object s Load approach to load a comma separated CSV file into Excel The CSV is read as a string array containing string columns This document is then parsed finally it uses the SaveAs method to keep the file within the CSV file format private void button Click object sender EventArgs e WorkBook wb WorkBook Load Normal Excel File xlsx Import xls csv or tsv file wb SaveAs Parsed CSV csv Exported as Parsed CSV Sheet csv Export the Parsed CSVThe exported CSV file will be stored as Parsed CSV Sheet csv because the records are on Sheet inner of the Excel Workbook Under what the file would appear like in the report Explorer when selected This export is done by reading the store data in the var path Parsing records using StreamReaderThe first test is to make sure the record to parse exists Inside the following code block mHasException and mLastException are from a base exception elegance which the class for parsing inherits The go back kind is a ValueTuple established the usage of NuGet package deal supervisor if File Exists inputFileName mHasException true mLastException new FileNotFoundException Missing inputFileName return mHasException new List lt DataItem gt new List lt DataItemInvalid gt If the document exists the subsequent step is to set up several variables for you to be used for validation functions and go back sorts in an effort to comprise legitimate and if presented invalid statistics while examine in information from the CSV record var validRows new List lt DataItem gt var invalidRows new List lt DataItemInvalid gt var validateBad int index int district int grid int nCode float latitude float longitude Reading CSV Data in C Records This process advances the reader through the following document We study the comma separated file CSV discipline files in trygetfield We use the study feature on the subject fields of the CSV 
files as report fields The following example shows how it is done The foreach var values read each string in the CSV file converted to a tabular form and a new record is created The first string of the CSV file is known as the public string Double quotes are used to delimit strings WorkBook workbook WorkBook LoadCSV Weather csv fileFormat ExcelFileFormat XLSX ListDelimiter WorkSheet ws workbook DefaultWorkSheet DataTable dt ws ToDataTable true parse sheet of sample xlsx file into datatable foreach DataRow row in dt Rows access rows for int i i lt dt Columns Count i access columns of corresponding row Console Write row i Console WriteLine The output produces a string line which is displayed as follows Save Workbook to CSV filewe can convert an excel workbook into a CSV file by converting the tabular form string rows separated by commas The following code uses the Workbook object s Load approach to load a record into Excel Then it uses the SaveAs method to store data in the CSV file within the preferred layout in this example csv What s thrilling here is that it appends the worksheet s name onto the filename which is a quite nifty reminder of where the statistics are coming from the source code below is a perfect example private void button Click object sender EventArgs e WorkBook wb WorkBook Load Normal Excel File xlsx Import xls csv or tsv file wb SaveAs Excel To CSV csv Exported as Excel To CSV Sheet csv The output CSV file seems like the following while opened in an everyday text Editor including Notepad How to Use C to Convert Datatable to CSV fileyou could export a data table to a comma separated file CSV with IronXL using taking an existing facts table and changing it to a CSV file in only some steps this newsletter goal is to reveal to you a short instance of thisFirst import the IronXL namespaceusing IronXL Then add the following code private void button Click object sender EventArgs e DataTable table new DataTable table Columns Add Example DataSet typeof string table Rows Add table Rows Add table Rows Add table Rows Add table Rows Add table Rows Add table Rows Add WorkBook wb WorkBook Create ExcelFileFormat XLS wb Metadata Author OJ WorkSheet ws wb DefaultWorkSheet int rowCount foreach DataRow row in table Rows ws A rowCount Value row ToString rowCount wb SaveAsCsv Save DataTable CSV csv Saved as Save DataTable CSV Sheet csv The above code creates a statistics desk and a brand new workbook specifying OJ as its owner writer A foreach var loop follows that inserts the records from the facts table into the Excel Worksheet Lastly the SaveAsCsv technique exports the data table to a CSV file The pointer moves from one line to the following line using a foreach var or a new streamreader Each row is read as a new string and is attributed as a string name the first line of the CSV file is read as the string column that contains the string column names You could download the software program product from this hyperlink The output Excel Worksheet appears as follows C Export from CSV to XLSX FileIronXL affords the maximum on hand way to export the data to Excel with xls xlsx and csv documents in net programs it s also viable to export the records to json and xml documents let s see one after the other how clean it may be to export Excel document statistics into these codecs First you need to read a CSV file it s straightforward to export an Excel report with a xlsx extension Let s see the instance in the code underneath our XlsFile xls report exists in bin gt Debug folder of the challenge 
Consider take into account writing down an extension with document call simultaneously as importing or exporting By default new Excel documents will be created within the bin gt Debug folder of the task If we want to create a brand new report in a custom path we will use wb SaveAs E IronXLNewXlsxFile xlsx examine the tutorial here to examine greater approximately how to export Excel files in internet using IronXL static void Main string args WorkBook wb WorkBook Load XlsFile xls Import xls csv or tsv file wb SaveAs NewXlsxFile xlsx Export as xlsx file Further to CSV parsing in C IronXL converts CSVs to Excel with just two strains of code The usage of C it s far very smooth to use IronXL s Excel API without the want for Interop You can examine edit and create Excel spreadsheets or work with different Excel codecs consisting of XLS XLSX CSV TSV With the assistance of a couple of frameworks you could purchase five merchandise for the fee of two Click on here for similar information IronXL license keys permit you to deploy your mission stay with no watermark Licenses start at just and encompass one free months of help and updates With a trial license key you may also strive for IronXL loose for days Iron software offers you the possibility to seize their complete package at a lower fee The iron software suite comprises five additives IRONPDF IRON XL IRONOCR IRONBARCODE and the IRONWEBSCRAPER You may get the whole bundle at the most exact percent price by paying in one installment It s undoubtedly a possibility worth going for You can download the software product from this link 2022-04-04 17:29:29
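The tutorial above is specifically about C# and IronXL, but the core ideas it walks through (first line as a header row, comma-delimited fields, values that may be wrapped in quotes) are language-agnostic. As a rough sketch only, here is the same idea in TypeScript with a tiny hand-rolled parser that handles the quoting rules a naive split(",") misses; it is not a substitute for the IronXL API shown in the excerpt.

```typescript
// Minimal CSV parsing sketch: header row + records, with support for quoted
// fields (commas, newlines and "" escapes inside quotes).
function parseCsv(text: string): string[][] {
  const rows: string[][] = [];
  let row: string[] = [];
  let field = "";
  let inQuotes = false;

  for (let i = 0; i < text.length; i++) {
    const ch = text[i];
    if (inQuotes) {
      if (ch === '"') {
        if (text[i + 1] === '"') {
          field += '"';              // "" inside quotes -> literal quote
          i++;
        } else {
          inQuotes = false;          // closing quote
        }
      } else {
        field += ch;                 // commas/newlines are kept inside quotes
      }
    } else if (ch === '"') {
      inQuotes = true;
    } else if (ch === ",") {
      row.push(field);
      field = "";
    } else if (ch === "\n" || ch === "\r") {
      if (ch === "\r" && text[i + 1] === "\n") i++;   // handle CRLF
      row.push(field);
      rows.push(row);
      row = [];
      field = "";
    } else {
      field += ch;
    }
  }
  if (field.length > 0 || row.length > 0) {
    row.push(field);
    rows.push(row);
  }
  return rows;
}

// Usage: the first row is the header, the rest are records.
const [header, ...records] = parseCsv('City,Temp\n"Bristol, UK",12\nOslo,5\n');
console.log(header);                 // ["City", "Temp"]
for (const record of records) {
  console.log(record.join(" | "));   // "Bristol, UK | 12", then "Oslo | 5"
}
```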
海外TECH DEV Community Amplication & React: Create the Backend https://dev.to/amplication/amplication-react-create-the-backend-43lf Amplication amp React Create the BackendWelcome to this tutorial on how to build a full stack application with Amplication What we will do is go step by step to create a Todos application using React for your frontend and Amplication for your backend If you get stuck have any questions or just want to say hi to other Amplication developers like yourself then you should join our Discord Table of ContentsStep Create a New AppStep Create an EntityStep Create a RoleStep Assign PermissionsStep Build the BackendStep Run the BackendStep Wrap Up Step Create a New AppHopefully you ve had the chance to create an Amplication account but if not don t fret Visit and you ll be directed to the login screen Here you can log in to an existing Amplication account or create one by signing in with a GitHub account You should end up at the New App page but if not you can get to it here Click the New App button in the top right corner Select Start from Scratch and wait a few seconds for the app to be generated You ll be directed to the application s entities An entity is equivalent to a collection in a NoSQL database or a table in a relational database By default a User entity is created for you This entity will eventually help us handle authentication But first let s deal with the backend Step Create an EntityThe main entity will be used to store tasks created by users Click Add entity When a New entity modal appears input Task into the input field and click Create Entity With the entity created we ll want to define a few fields for task elements On the left hand panel you ll see the Fields this entity has and at the very bottom there will be an option to add a field The first field will be Text Type that into the Add field input and hit enter The new field will be created and a few options will appear Notice a dropdown for the Data Type of this field is set to Single Line Text That s perfect as it ll be a string input of a task There are many different data types Amplication can enforce for fields The only change that needs to be made here is this will be a required field Toggle the Required Field switch Changes will be automatically saved Like before create a new field called Completed This field should also be a required field but we will change the data type Click the Data Type dropdown and change this field to be a Boolean The final field we will need to create should be called UID This field will be used to relate a task to a user Mark this as a required field In the Data Type dropdown select Relation to Entity The Related Entity dropdown should appear select User A modal asking to create a relation between a Task and a User will appear Click Create to create the relation To learn more about entity relations there s an article on the docs website here Step Create a RoleAmplication allows for granular permission to create read update and delete entries in the different entities of the backend User s who will be creating tasks in the Todo app will need to be granted certain permissions to create read and update their tasks and prevent them from doing other things Click the Roles icon on the left hand panel Then much like the properties we added to the Task entity create a role called Todo User Step Assign PermissionsWith a role for users of the Todo app created we ll want to assign certain permissions to the Todo User role Click the Entities icon on the left hand panel By 
default every role has CRUD create read update and delete access to every entity It is important to limit the scope of our Todo users Select the User entity from the list of entities and on the left hand panel select Permissions Every type of command is granted to All Roles Users with the User or Todo User role have full access to the User entity This can be dangerous The default admin account created by the backend has the role User so we don t want to mess with that What we will eventually do is have it so all new users are assigned the Todo User role and we will limit their access to certain commands Toggle the permissions for each of the entity s commands to Granular and toggle on the User role Now the only user who can access User accounts will have the User role which will belong only to the admin account Return to the Entities page and now select the Task entity Click Permissions Toggle the Delete command to Granular and enable access to the User role Both Users the admin and Todo Users regular users of the app will be able to create read and update tasks but only Users will be able to delete tasks Step Build the BackendWith the new Task entity created and a relation with User s created We re now ready to build the backend On the right side panel is the Pending Changes where the changes to Task and User will appear Click Commit changes amp build to finalize the changes as well as to deploy an instance of the backend into a sandbox environment On the bottom of the page there s a status button with the text Preparing sandbox environment Clicking that will route you to a log of the backend being dockerized and deployed This takes a few minutes but once complete you can see the backend by clicking the Open Sandbox environment but we will not be using the sandbox for the Todo app Amplication by default creates a secure environment where all requests need to be authenticated For this use case we will want to ease some of those protections Thanks to Amplication s extensibility we can build on top of everything that was generated for us We ll start by downloading the backend In the bottom right of the page you ll see a download button Click that and you ll download a zip file containing all of the code to run the backend Extract the zip file and copy all of the contents except for the README md to the root of the amplication react project Step Run the BackendThe admin ui and server folders generated by Amplication are two new node projects that need to be set up One thing both will need is their dependencies In the package json update the postinstall script postinstall npm ci prefix web amp amp npm ci prefix admin ui amp amp npm ci prefix server Open a new terminal and run the command below This command will install the dependencies of all the subfolders Another useful aspect of this command is that if you were to push this project to GitHub and cloned the repo when you run npm install this script will occur after the install to install the dependencies of the subfolders automatically npm run postinstallThere will be some minor conflicts with the code create react app created for us and the code Amplication created for us This should be easy to correct though Install cross env and npm run all as a dev dependency as follows npm install D cross env npm run allUpdate the start script in package json and add the script below as well By doing this the Todo app UI will now run on port during development so it won t conflict with Amplication s default port for the server which is We ve also set 
the start to script to run our frontend and backend code at the same time start npm run all p start frontend start backend start frontend cross env PORT npm prefix web start start admin npm prefix admin ui start start backend npm prefix server start Before starting the server there are a few additional steps required Read server README md for directions to Create a Prisma clientStart a database in DockerInitiate the databaseWhen those steps have been completed run the following command npm run start Step Wrap UpThe frontend of the Todo app will be running at http localhost and the backend will be running at http localhost Visiting http localhost will greet you with a error Instead visit http localhost api to see all the endpoints of the backend and to see what our REST endpoints will look like With our backend created and running locally we re almost ready to link it to the frontend First we ll need to make some additions to the code Check back next week for step three or visit the Amplication docs site for the full guide now To view the changes for this step visit here 2022-04-04 17:17:47
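Step 7 above notes that the generated backend serves REST endpoints under http://localhost:3000/api. As a minimal sketch of how the React frontend might call them once the wiring described in the next part is in place, here is a hypothetical fetch helper in TypeScript; the /api/tasks route, the Task field names, and the bearer-token header are assumptions for illustration, not Amplication's documented API.

```typescript
// Hypothetical client for the generated backend. Route and field names are
// assumptions based on the Task entity created in this tutorial.
interface Task {
  id: string;
  text: string;
  completed: boolean;
}

const API_URL = "http://localhost:3000/api";

async function fetchTasks(jwt: string): Promise<Task[]> {
  // List the current tasks; auth header assumed to carry the user's JWT.
  const res = await fetch(`${API_URL}/tasks`, {
    headers: { Authorization: `Bearer ${jwt}` },
  });
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  return res.json();
}

async function createTask(jwt: string, text: string): Promise<Task> {
  // Create a task; relating it to a user (the uid field) is covered later.
  const res = await fetch(`${API_URL}/tasks`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${jwt}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ text, completed: false }),
  });
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  return res.json();
}
```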
海外TECH DEV Community The journey of sharing a wired USB printer over the network https://dev.to/maciekmm/the-journey-of-sharing-a-wired-usb-printer-over-the-network-fg8 The journey of sharing a wired USB printer over the networkI was in the market for a printer that was cheap to buy and cheap to run I did not print in color so I concluded that a dot matrix laser printer would be a good choice I looked up a couple of units and decided on Brother DCP as it was on sale for with replacement toners running for apiece Not a bad deal It had one caveat no ethernet port no WiFi support and no Internet Printing Protocol I did not need all the features but an ethernet port would be niceCourtesy of XKCDThat did not put me off I have never been printing much and could live with the cable I also knew I could attach it to a print server such as CUPS and add networking capabilities to the network impaired printer I had a Raspberry Pi Zero sitting around and I wouldn t hesitate to use it This post is a story about the journey I experienced setting it up The driver support or lack thereofI flashed Arch Linux ARM onto my RPi and went driver hunting It turned out the manufacturer does not support the ARM architecture After a couple of DDG searches I found brlaser a community driven Brother driver Perfect I installed CUPS compiled the driver and shared the printer over the network Brlaser as seen in the driver selection prompt I clicked the Print button Why does it take seconds to print a single page‽My eyes turned towards the printer anticipating the first sheet of paper to appear quickly but nothing happened Not until I tried to ssh into the Pi and started debugging After a couple of seconds I began to hear the brrr printer noise while it was spitting out the page I tried to print a second one as I thought it needed some priming but it also took well over half a minute I clicked Print again and began observing the CUPS interface It took s to get through the Processing Page state Something was off As it turns out Raspberry Pi Zero is not that powerful The driver uses ghostscript which pegged the CPU usage to Back to the drawing board Can I share an arbitrary USB device over the network Sure I can I wasn t aware of the RAW queue mode in CUPS at the time but I heard about usbip which sounded like a compelling solution to the problem or so I thought I followed a great Arch Wiki tutorial on usbip and sure enough I had my printer attached The need to load a kernel module was a bit unsettling but I installed the drivers and configured the printer using CUPS successfully I hit Print and almost immediately had the page handy Whoa that was easier than expected It s not all roses I rebooted my laptop the next day and tried to print again a real document this time It wouldn t work the device was gone I couldn t detach the device I couldn t attach a new one I tried reloading the kernel module and it wouldn t work either nada At that point I concluded I m not in favor of running a kernel module that s misbehaving I also had some security concerns around the whole architecture and therefore I uninstalled it Learning about RAW queuesI came back to the problem after a couple of days but now I had a powerful tool under my belt knowledge about CUPS RAW queues A RAW queue is a queue where the filtering system is not involved and the print job goes directly to a printer or another queue This seemed promising for my use case The idea was to set up a RAW queue on the Pi and then do the heavy lifting filtering on significantly more 
powerful user machines PC Raspberry PI filtering gt raw queue gt Printer brlaser gs no filtering I quickly compiled this setup and it worked out perfectly Note CUPS plans to deprecate drivers and raw queues in the future because of how wide spread IPP has become I don t consider this to be a big issue you can always pin the CUPS version There s still a ton of old hardware out there and that won t change quickly and doesn t have to The SD card gives upAfter a few weeks the SD card running the CUPS server gave up I bought a new one and tried to quickly reinstall everything but Arch Linux ARM abandoned armv architecture in the meantime I decided to use Raspberry Pi OS and automate the setup Automating the build processI like to have my infrastructure defined in code and I maintain a number of Ansible playbooks and Terraform workspaces to control my servers Packer seemed like the perfect tool for the job I have never used it before and wanted to get familiar with the tool It doesn t come with ARM support out of the box but there are two community projects to fill that niche I tried the packer builder arm first but it couldn t run my Ansible playbook due to a bug I applied a patch but quickly ran into other issues At that point I decided to use packer plugin arm image instead The setup did not work out of the box but after a simple PR it built my first empty image and proved it s possible to build an ARM image locally Defining goalsI wanted an end to end setup with the following functionality OS installation Raspberry Pi OS WiFi setupAutodiscovery via mDNS amp DNSSDCUPS installationPrinter configurationI developed two Ansible playbooks to accomplish that I will skip the technicalities and maybe write about the details another time Run it yourself The final solution with a comprehensive README can be found under the maciekmm printer rpi image github repo Building flashing and running the image yourself is as simple as running build the base imageWIFI SSID lt SSID gt WIFI PASSWORD lt PASSWORD gt SSH PUBLIC KEY cat ssh id ed pub vagrant up flash the image onto the sd carddd bs M if output raspberry pi os img of lt sdcarddevice gt amp amp sync NOTE this runs on the live Pi Connect it first with printers attached configure the printer and firewallsansible playbook i hosts live yamlWhat s left is configuring the driver and discovering the printer locally This can be done by installing CUPS locally and running through the wizard Closing thoughtsI accomplished several things I made my wired printer wireless I learned how to use Packer I learned a bit about CUPS queues I published an open source project for you to be able to do the same Overall this was an interesting project and I m happy to share this story 2022-04-04 17:17:03
海外TECH DEV Community Make a URL Shortener with SvelteKit https://dev.to/spences10/make-a-url-shortener-with-sveltekit-46c0 Make a URL Shortener with SvelteKitURL shorteners use them for when you want to share an easy to remember link You can use a service like Bitly or TinyURL or any of the other ones out there already or you could make it something you d want to use and have a bit more of a connection to it by making your own In the past I ve made a personal URL shorteners with a Netlify redirects file and with a Vercel vercel json file In this post I m going to make a URL shortener with SvelteKit I will use a SvelteKit endpoint to redirect the requests made to it This will redirect the source URL to the target or destination URL An example could be the URL given to this project say anything after the TLD li will be the source so will be redirected to the destination URL for that source In the previous two projects I made there wasn t anything in the way of a front end framework as they were just configuration files to do the redirects on the server This is still pretty much the same as it will be taking an incoming request on the server in the SvelteKit endpoint and redirecting it Setup the projectI ll scaffold out a skeleton SvelteKit using the following command npm init svelte next svort urlsI ll follow the prompts I ll be yes to all the prompts there Which are Which Svelte app template › Use arrow keys Return to submit SvelteKit demo app❯Skeleton projectWhich Svelte app template ›Skeleton projectUse TypeScript …YesAdd ESLint for code linting …YesAdd Prettier for code formatting …YesAdd Playwright for browser testing …YesI m not going to be covering browser testing in this post but it s nice to have the config there if you need it Create the endpointIn the routes folder I ll create a new slug ts file touch src routes slug tsThe slug ts file is an endpoint a HTTP endpoint in SvelteKit you can use HTTP methods in endpoints So if I want to GET some data in a route I can access it via these special SvelteKit files In this case I m using a GET method so the source can be redirected to the destination export const get async gt return headers Location status This will accept anything after root path and redirect it at the moment back to the homepage So going to localhost me will redirect to localhost That is pretty much it For the list of links I ll be using a local config file but you can use something like a CMS or a database to control this Add source and destination URLsI m going to add the source and destination URLs to a config file In SvelteKit the place for this would be in a lib folder create the lib foldermkdir src lib create a file for the urlstouch src lib urls list tsIn the urls list ts file I ll add the source and destination URLs I want I ll add some example one here export const urls source me destination source twitter destination source git destination Redirect to the destination URLWith my list of short links in place I can use them in the slug ts endpoint so going to localhost me I will now want redirect to I ll need a way to know what the source URL is in the endpoint so I can destructure that out of the context passed tp the endpoint Let s take a quick look at what we get in the context or ctx object export const get async ctx gt console log ctx return headers Location status So now if I navigate to localhost something I ll see the following output in the terminal request Request size follow compress true counter agent undefined highWaterMark insecureHTTPParser false 
Symbol Body internals body null stream null boundary null disturbed false error null Symbol Request internals method GET redirect follow headers Object parsedURL URL signal null referrer undefined referrerPolicy url URL href http localhost something origin http localhost protocol http username password host localhost hostname localhost port pathname something search searchParams URLSearchParams hash params slug something locals platform undefined So what I m interested in here is the url object more specifically the url pathname This is going to help me identify where I want the request to be redirected to url URL href http localhost something origin http localhost protocol http username password host localhost hostname localhost port pathname something search searchParams URLSearchParams hash I could also use params slug object for this as well params slug something In this example I ll be using the url object So I ll destructure the url object out of the context object and import the urls list file import urls from lib urls list export const get async url gt return headers Location status Then I can get a redirect out of the urls array I ll declare this this as a redirect variable So I ll see what I get if I log out the contents of redirect now I m going to want to filter for anything that matches the url pathname from urls list file so for now I ll console log out the results import urls from lib urls list export const get async url gt const redirect urls filter item gt console log item return headers Location status Now if I navigate to localhost something I ll see the following in the terminal source me destination source twitter destination source git destination Sweet So now I can use some logic to determine if the url pathname matches what s in the urls array So with the item I m using in the filter I can compare against the url pathname If there s a valid match I can get the destination from the urls array import urls from lib urls list export const get async url gt const redirect urls filter item gt item source url pathname if redirect return headers Location redirect destination status else if redirect amp amp url pathname length gt return headers Location status else return I can use an if to check for a valid match If there s a valid match then set the headers Location to the destination from the urls array If it doesn t match I ll redirect to the homepage and have a final catch to return an empty object ConclusionThat s it I ve created a simple redirect in SvelteKit that will take an incoming URL and redirect to a destination URL I can now use the homepage as a landing page for my short URLs so anyone coming to the site can check out any of the available links Further explorationThis has been a bit of an eye opener for me so I think I m going to experiment with using something in the way of a backend Undecided yet but I could make this something for users and not just a personal project for me This would involve authentication and something to store the user data i e a CMS more than likely GraphCMS or a database of some sort I ve not checked out planet scale yet so could take a look at that AcknowledgementsThanks to Rainlife over on the Svelte Discord for suggesting the use of HEAD as I m only interested in the header of the request Also thanks to Jordan also on the Svelte Discord for giving me this handy MDN link for Redirections in HTTP Also Dana Woodman on Dev to for using redirects in SvelteKit endpoints I was using redirect instead of setting the headers 2022-04-04 
17:16:58
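For convenience, here is the finished endpoint described in the entry above, reassembled from the flattened excerpt into a single runnable TypeScript file (this uses SvelteKit's endpoint API as of the article's April 2022 version, which has since changed). The destination URLs are placeholders because the excerpt does not preserve the article's actual links.

```typescript
// src/routes/[slug].ts — reconstructed from the article's flattened excerpt.
// In the article the urls array lives in src/lib/urls-list.ts and is imported
// via `import { urls } from '$lib/urls-list'`; it is inlined here so the
// sketch is self-contained. Destinations are placeholders.
const urls = [
  { source: '/me', destination: 'https://example.com/me' },
  { source: '/twitter', destination: 'https://twitter.com/example' },
  { source: '/git', destination: 'https://github.com/example' },
];

// SvelteKit (pre-1.0) GET endpoint: look up the requested path and redirect.
export const get = async ({ url }: { url: URL }) => {
  const redirect = urls.filter((item) => item.source === url.pathname);

  if (redirect.length > 0) {
    // Known short link: send the visitor to its destination.
    return { status: 301, headers: { Location: redirect[0].destination } };
  }
  if (url.pathname.length > 1) {
    // Unknown short link: fall back to the homepage.
    return { status: 301, headers: { Location: '/' } };
  }
  // Root path: let the regular page render.
  return {};
};
```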
海外TECH DEV Community C# csv parser (Step by Step Tutorial) https://dev.to/bristolsamo/c-csv-parser-step-by-step-tutorial-25ok C csv parser Step by Step Tutorial It s nearly a proper passage for a junior developer to cludge collectively their own CSV parser the use of an easy string split and then sooner or later a training session that there is a little bit more to this whole CSV thing than simply setting apart out values by way of a comma In truth CSVs have many nuances around how line breaks are handled within fields or how areas can be contained inside costs that destroy an absolute string cut up method What is CSV CSVs Comma Separated Values are famous import and export data formats used in spreadsheets and databases Commonly information is saved on one line and separated using commas It s miles important to apply our CSV help package deal for our tasks CSV is an easy statistics layout but there can be many differences Those may additionally consist of different delimiters new strains or rates It s miles feasible to read and Write CSV statistics with the assistance of the CSV assist library CSV GotchasMore than one CSV problem must be brought up before we dive deeper With a bit of luck they must explain why rolling your personal is extra ache from time to time than it s well worth A CSV might also or may not have a header row If there s a header row then the order of the columns isn t always crucial because you may hit upon what is in reality in each column If there may be no header row then you depend on the order of the columns being identical Any CSV parser must be able to use each study column based totally on a header price and index Any field may be contained in rates But areas that incorporate a line damage commas or quotation marks should be included in fees To emphasize the above line breaks within a discipline are allowed within a CSV so long as they re wrapped in quotation marks this is what trips most people up who are studying line by line adore It s a daily textual content report The Quote marks within a field are notated by double quote marks As hostile to mention an escape character like a back curb All rows must have the same number of columns but inside the RFC labeled as a have to and no longer a have to At the same time assure the C in CSV stands for a comma Ideally a CSV parser can also deal with TSV that is the usage of tabs in preference to commas This is just for parsing CSV files into primitives but in something like internet we will additionally be wanting Deserializing right into a list of objectshandling of Enum valuescustom mappings So the header fee can also or won t fit the name of the elegance belongings in C Mapping of nested itemswhat is a CSV file parser The CSV parser elegance is an abstract class that helps to parse a record or stream of comma separated values In summary it should be inherited by way of a programmer developed elegance which ought to at a time minimum put into force the summary techniques Most of the strategies inside the CSVParsermagnificence are virtual permitting the programmer to override their capability with new or supplementary processing The code takes a comma delimited textual content report or flow and parses every line into discrete data fields CSVs have an abundance of troubles with how line breaks are treated in fields or how fields can be contained in prices that block a simple string split method I ve recently discovered the following alternatives while converting CSVs to C Net it is the first time I ve ever used 
simply one string Split instead of a simple Strings cutup to split the values in a comma In this text we can look at the pleasant methods in which C has a CSV parsing feature in C XcPC net considerations for processing statisticssubsequent need to continually be considered while uploading CSV documents All columns are suspected to be missing altogether or missing in one or greater rows Mixed information kinds don t forget a column with dates in which some rows may additionally have malformed dates dates set up for a great tradition columns that should be numeric had been a few rows have no cost or sudden format and so on Columns that have values that aren t legitimate to your business e g a listing of merchandise that needs to map to a product desk in which there are products that you don t manage However incoming information has values thru a hundred The file is in use using another system and is locked The document is extraordinarily huge and processing time can also take hours have a plan to run a nightly activity Dealing with rows columns that don t healthy in the database have a plan to deal with them Presenting clients a method s to study suspect data alter or reject the information Do not forget an intermediate database desk to process suspect information that can be executed over the years There may be a massive information set that may take hours or days to process Don t forget running with CSV files is a puzzle no matter what the structure must be and that unique parsing documents commonly have their very own quirks Parse CSV Files With the TextFieldParser Class in C to use the TextFieldParser magnificence we have to reference the Microsoft VisualBasic dll in our C code The TextFieldParser elegance consists of many methods for parsing dependent textual content documents in C We can study a CSV file with the TextFieldParser class with the aid of placing the delimiters with the SetDelimiters function inside the TextFieldParser magnificence The below code example shows us how to parse data from a CSV file with the TextFieldParser magnificence in C using System using Microsoft VisualBasic FileIO namespace parse csv class Program static void Main string args using TextFieldParser textFieldParser new TextFieldParser C File Sheet csv textFieldParser TextFieldType FieldType Delimited textFieldParser SetDelimiters while textFieldParser EndOfData string rows textFieldParser ReadFields Inside the above code we initialized the instance textFieldParser of the TextFieldParser class by specifying the path to our CSV document within the constructor We then set our text area type to be delimited with the textFieldParser TextFieldType FieldType Delimited and set as the delimiter with textFieldParser SetDelimiter feature We then used a while loop to examine the CSV file to give up with the textFieldParser EndofData We stored the facts internal an array of strings with the ReadFields feature Parse data from a CSV File With the FileHelpers parser Library in C In C we have a report parser parsing the record based on its contents The TextFieldParser is described in Microsoft VisualBasic FileIO library Earlier than executing the program underneath don t neglect to feature a reference to Microsoft VisualBasic The FileHelpers library examines and writes records to files streams and strings in C It s far from a rd party library and does not come pre hooked with the internet framework We can easily install it by looking at it inside the NuGet bundle supervisor inside the visible Studio IDE We can use the 
FileHelpersEngine class to parse information from a CSV file in C The FileHelperEngine class gets the records from the document into magnificence objects in C So we must first create a version of elegance that may preserve our statistics from the record The grace would include fields that represent columns within the CSV record We will use the DelimitedRecord to write that the is used as a delimiter here ReadFile path function can also be used to read facts interior and an array of sophisticated objects from the document inside the exact route The subsequent code instance suggests how to parse a CSV file with the FileHelpers library in C using FileHelpers using System namespace parse csv DelimitedRecord public class Record public string Name public string Age class Program static void Main string args var fileHelperEngine new FileHelperEngine lt Record gt var records fileHelperEngine ReadFile C File records csv foreach var record in records Console WriteLine record Name Console WriteLine record Age The above code reads the document and keeps it in an array of objects of the file elegance with the FileHelpers library in C Parsing data using StreamReader In C StreamReader magnificence is used to cope with the documents It opens reads and helps act different features to specific styles of documents We can also perform particular operations on CSV files while using this class Parsing CSV filesFirst check to make sure the report to parse exists The following code blocks mHasException and mLastException are from a base exception magnificence that the class for parsing inherits The go back kind is a ValueTuple hooked up using NuGet package manager for example if File Exists inputFileName mHasException true mLastException new FileNotFoundException Missing inputFileName return mHasException new List lt DataItem gt new List lt DataItemInvalid gt If the document exists the next step is to set up numerous variables to be able to be used for validation purposes and return sorts in order to include valid and if offered invalid facts whilst studying in records from the CSV record var validRows new List lt DataItem gt var invalidRows new List lt DataItemInvalid gt var validateBad int index int district int grid int nCode float latitude float longitude the subsequent code block follows the code block above A while declaration is used to loop thru each line inside the CSV file For every line break up the road through a comma In this case that s the most commonplace delimiter Subsequently validate there are nine factors in the string array If there aren t nine elements in the array locate them into a possible reject container Be aware that the primary line contains skipped column names using checking the index line variety stored inside the variable index Following the check for nine factors in a line of seven factors within the string the array is checked to make sure they can be converted to the anticipated statistics kind starting from date to numerics and empty string values Passing the kind test above the section beneath the comment Questionable fields will do numerous more excellent exams e g does the NICIC field incorporate data that isn t always in an anticipated variety Note all facts ought to be checked here consisting of the statistics in part as this may be subjective to the points in other factors inside the array so this is left to the overview technique so that it will give a grid with a dropdown of validating alternatives to pick from If there are troubles to review a report the property is 
Parsing Data Using StreamReader
In C# the StreamReader class is used to work with files: it opens, reads and performs other operations on various kinds of files. We can also perform specific operations on CSV files with this class.

Parsing CSV files
First, check that the file to parse exists. In the following code blocks, mHasException and mLastException come from a base exception class that the parsing class inherits from. The return type is a ValueTuple (installed via the NuGet package manager), for example:

if (!File.Exists(inputFileName))
{
    mHasException = true;
    mLastException = new FileNotFoundException("Missing " + inputFileName);
    return (mHasException, new List<DataItem>(), new List<DataItemInvalid>());
}

If the file exists, the next step is to set up several variables that will be used for validation and for the return types that will contain the valid and, if present, invalid data read in from the CSV file.

var validRows = new List<DataItem>();
var invalidRows = new List<DataItemInvalid>();
var validateBad = 0;

int index = 0;
int district = 0;
int grid = 0;
int nCode = 0;
float latitude = 0;
float longitude = 0;

The next code block follows the one above. A while statement is used to loop through each line in the CSV file. Each line is split on a comma, in this case the most common delimiter. Then we validate that there are nine elements in the string array; if there are not, the row is placed into a possible-reject container. Note that the first line, which contains the column names, is skipped by checking the line number stored in the variable index. After the check for nine elements, seven elements in the string array are checked to make sure they can be converted to the expected data type, ranging from dates to numerics and empty string values. The section beneath the comment "Questionable fields" then performs several more checks, e.g. does the NCIC field contain data that is outside the expected range? Not all data can be checked here, as some of it is only meaningful relative to other elements in the array, so that is left to the review process, which will present a grid with a dropdown of validation options to pick from. If a row needs review, a flag is set to mark the data for manual assessment and the row is added to a list.

try
{
    using (var readFile = new StreamReader(inputFileName))
    {
        string line;
        string[] parts;

        while ((line = readFile.ReadLine()) != null)
        {
            parts = line.Split(',');
            index += 1;

            if (parts == null)
            {
                break;
            }

            validateBad = 0;

            if (parts.Length != 9)
            {
                invalidRows.Add(new DataItemInvalid() { Row = index, Line = string.Join(",", parts) });
                continue;
            }

            // Skip first row which in this case is a header with column names
            if (index <= 1)
            {
                continue;
            }

            // These columns are checked for proper types
            var validRow = DateTime.TryParse(parts[0], out var d) &&
                           float.TryParse(parts[7].Trim(), out latitude) &&
                           float.TryParse(parts[8].Trim(), out longitude) &&
                           int.TryParse(parts[2], out district) &&
                           int.TryParse(parts[4], out grid) &&
                           !string.IsNullOrWhiteSpace(parts[5]) &&
                           int.TryParse(parts[6], out nCode);

            // Questionable fields
            if (string.IsNullOrWhiteSpace(parts[1]))
            {
                validateBad += 1;
            }

            if (string.IsNullOrWhiteSpace(parts[3]))
            {
                validateBad += 1;
            }

            // NCIC code must be greater than zero
            if (nCode <= 0)
            {
                validateBad += 1;
            }

            if (validRow)
            {
                validRows.Add(new DataItem()
                {
                    Id = index,
                    Date = d,
                    Address = parts[1],
                    District = district,
                    Beat = parts[3],
                    Grid = grid,
                    Description = parts[5],
                    NcicCode = nCode,
                    Latitude = latitude,
                    Longitude = longitude
                });
            }
            else
            {
                // Inspect validateBad > 0 to see which fields to review in specific rows
                invalidRows.Add(new DataItemInvalid() { Row = index, Line = string.Join(",", parts) });
            }
        }
    }
}
catch (Exception ex)
{
    mHasException = true;
    mLastException = ex;
}

As soon as the loop above has finished, the following line of code returns the data to the calling form as a ValueTuple:

return (IsSuccessFul, validRows, invalidRows);

Parsing Data Using OleDb
This approach reads lines from a CSV file with the drawback that the fields are not typed and carry more baggage than is needed for processing, which can make a noticeable difference in processing time with large CSV files, as in the example below.

public DataTable LoadCsvFileOleDb()
{
    // Jet text driver: the folder is the data source and the file name acts as the table.
    var connString = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" +
                     Path.GetDirectoryName(inputFileName) +
                     ";Extended Properties=\"Text;HDR=Yes;FMT=Delimited\"";
    var dt = new DataTable();

    try
    {
        using (var cn = new OleDbConnection(connString))
        {
            cn.Open();

            var selectStatement = "SELECT * FROM [" + Path.GetFileName(inputFileName) + "]";

            using (var adapter = new OleDbDataAdapter(selectStatement, cn))
            {
                var ds = new DataSet("Demo");
                adapter.Fill(ds);
                ds.Tables[0].TableName = Path.GetFileNameWithoutExtension(inputFileName);
                dt = ds.Tables[0];
            }
        }
    }
    catch (Exception ex)
    {
        mHasException = true;
        mLastException = ex;
    }

    return dt;
}
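To show what the caller gets back from the OleDb approach, here is a small consumption sketch; the class and method names are my own and not from the article. It simply walks the untyped rows of the DataTable that a method like LoadCsvFileOleDb returns.

using System;
using System.Data;

class OleDbUsage
{
    // Every value comes back as an object, so the caller must convert types itself -
    // the "extra baggage" drawback mentioned above.
    static void Dump(DataTable dt)
    {
        foreach (DataRow row in dt.Rows)
        {
            Console.WriteLine(string.Join(" | ", row.ItemArray));
        }
    }
}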
Parsing CSV Files in C# Using IronXL
IronXL is a .NET library developed by Iron Software. It provides features and APIs to help us read, create and edit Excel files and spreadsheets. IronXL does not require Excel to be installed on your server, nor Interop; furthermore, IronXL provides a faster and more intuitive API than Microsoft Office Interop Excel. With IronXL it is quite easy to parse data from CSV files and build a CSV parser: with only two lines of code you can load a CSV file and convert it to Excel.

Adding the IronXL NuGet Package
Before you can use IronXL to read CSV files in MVC, ASP.NET or .NET Core, you need to install it. Here is a brief walk-through:
- In Visual Studio, select the Project menu
- Manage NuGet Packages
- Search for IronXL.Excel
- Install

When you need to read CSV files in C#, IronXL is a convenient tool. You can read a CSV file with commas or any other delimiter, as seen in the code segment below.

WorkBook workbook = WorkBook.LoadCSV("Weather.csv", fileFormat: ExcelFileFormat.XLSX, ListDelimiter: ",");
WorkSheet ws = workbook.DefaultWorkSheet;
workbook.SaveAs("Csv_To_Excel.xlsx");

Create a New Project
After you have installed IronXL, create a new project and add the IronXL namespace:

using IronXL;

Load a CSV File into Excel
The code below uses the WorkBook object's Load method to load a CSV file into Excel. This file is then parsed; finally, the SaveAs method stores the file in CSV format.

private void button_Click(object sender, EventArgs e)
{
    WorkBook wb = WorkBook.Load("Normal_Excel_File.xlsx"); // Import .xls, .csv, or .tsv file
    wb.SaveAs("Parsed_CSV.csv");                           // Exported as Parsed_CSV.Sheet1.csv
}

Remember to create an Excel workbook named Normal_Excel_File.xlsx containing the sample records.

Export the Parsed CSV File
The exported CSV file will be saved as Parsed_CSV.Sheet1.csv, because the data is on Sheet1 inside the Excel workbook. Below is what the file would look like in File Explorer.

IronXL license keys let you deploy your project live with no watermark. Licenses start at a modest price and include free support and updates, and with a trial license key you can also try IronXL for free. Iron Software also gives you the chance to get its complete package at a lower cost: the Iron Software suite comprises five components, IronPDF, IronXL, IronOCR, IronBarcode and IronWebScraper, and you can get the whole bundle by paying in one installment. It is certainly an option worth considering. You can download the software product from this link. 2022-04-04 17:16:18
海外TECH DEV Community Will leaving my job for mental health reasons ruin my professional reputation? https://dev.to/johnwoody/will-leaving-my-job-for-mental-health-reasons-ruin-my-professional-reputation-47m9 Will leaving my job for mental health reasons ruin my professional reputation I work as a web dev designer for a digital marketing agency and I am incredibly burnt out A few months ago there were some legal issues that came up that I was asked to resolve Since then my life has been hell I feel anxious all the time about making mistakes About a month ago I asked to speak with my boss because I wanted to know if I was causing issues within the company During this chat I broke down and cried and he reassured me I was doing well Fast forward to this past week I was asked to design tshirts business cards make social media posts for a company event coming up I was given a week to complete these I also have my other usual web dev tasks like building new sites making updates for the sales team etc I also was asked to have a complete redesign of the company website done by April I was feeling so overwhelmed and asked my boss if I could have focused time to work on the company site because I get pulled into big tasks that take a lot of time to complete He told me that I can work on it a few hours a day but I still need to do the usual work I broke down again and basically told him this was a lot of work He asked me if I had depression and in the end I disclosed that I have struggled with anxiety and depression in the past Well since then the sales and production team have been very stand offish and passive aggressive On Friday at am they asked me to completely redo one of their sites and write new content so they could get approval for Adsense The due date Monday morning I decided to just use a free template because that was the only feasible way I would get it done in time I honestly just want to quit my job but I m terrified that I have ruined this company as a future reference because of my past breakdowns I plan to look for new jobs but I m honestly dreading going back to the office I just don t know what to do I ve been working so hard and I feel like I ruined my professional reputation for breaking down in front of my boss 2022-04-04 17:14:33
海外TECH DEV Community A Three.js 3D world with Character + Movements + Third Person Camera https://dev.to/louis3797/a-threejs-3d-world-with-character-movements-third-person-camera-325g A Three.js 3D world with Character + Movements + Third Person Camera
Hey, this is my first project with Three.js, so I'm relatively new to this library. However, I want to show you what I built, and maybe it will help some of you. GitHub Repo. Note that there are still some bugs, and some functions, like object collision, are not included yet.
Built with: react, react-three-fiber, react-three-drei, three.js, typescript, simplex-noise.
Features: 3D Nature Objects, Simplex Noise Floor, 3D Character, Animations (Idle, Walk, Run, Dance), Movement with w/a/s/d, Dance with e, Third Person Camera.
My GitHub 2022-04-04 17:10:31
海外TECH DEV Community Finally received Hacktoberfest 2021 Swags... https://dev.to/durgesh2001/finally-recieved-hactoberfest2021-swags-3jc4 Finally received Hacktoberfest 2021 Swags 2022-04-04 17:10:11
海外TECH DEV Community Each Logic in Svelte https://dev.to/smpnjn/each-logic-in-svelte-5gm4 Each Logic in Svelte
Previously I covered how if/else logic works in Svelte. Today I'll be going over how to use each logic in Svelte, letting us iterate over objects to display their contents. If you're new to Svelte, read my guide on how to create your first Svelte project here.

Motivation for each logic in Svelte
Let's say we are making a component in Svelte, as covered in my guide on Svelte components. We have a set of data and some basic CSS. In Svelte this looks like a normal HTML/CSS/Javascript file:

<script>
    let locations = [
        { country: 'UK', city: 'London' },
        { country: 'India', city: 'Mumbai' },
        { country: 'France', city: 'Paris' }
    ]
</script>

<div id="data">
</div>

<style>
    #data {
        padding: 1rem;
        border-radius: 4px;
        border: 1px solid #eee;
    }
</style>

Without Svelte, if we want to display all of that data, we would have to loop through it and generate the HTML with Javascript. With Svelte it's much easier to loop through data: we simply use each logic.

How each Logic in Svelte Works
With each, we can take our data div and iterate through our locations object right within it. The each loop in Svelte has the following syntax:

{#each locations as { country, city }, i}
                                       └ the index of the current item
                   └ all the properties we want to access from locations
        └ the variable in our Javascript to iterate over
  └ the start of the each block

Essentially we can deconstruct our locations entries into however many properties we want to display. If we wanted to only show country, we could write {#each locations as { country }, i}. We can even entirely leave out i and just write {#each locations as { country, city }}.

Using each Logic in Svelte
Let's look at an example. Below, I've turned our location list into HTML using each logic:

<script>
    let locations = [
        { country: 'UK', city: 'London' },
        { country: 'India', city: 'Mumbai' },
        { country: 'France', city: 'Paris' }
    ]
</script>

<div id="data">
    {#each locations as { country, city }, i}
        <div class="country-city">{i}: {country}, {city}</div>
    {/each}
</div>

<style>
    #data {
        padding: 1rem;
        border-radius: 4px;
        border: 1px solid #eee;
    }
</style>

The above code will produce the following output:
0: UK, London
1: India, Mumbai
2: France, Paris

Svelte makes this super easy and gives us the flexibility to leave data out. For example, here is an example displaying country only, with no index:

<script>
    let locations = [
        { country: 'UK', city: 'London' },
        { country: 'India', city: 'Mumbai' },
        { country: 'France', city: 'Paris' }
    ]
</script>

<div id="data">
    {#each locations as { country }}
        <div class="country-city">{country}</div>
    {/each}
</div>

<style>
    #data {
        padding: 1rem;
        border-radius: 4px;
        border: 1px solid #eee;
    }
</style>

The above code will produce the following output:
UK
India
France

Keyed each Blocks in Svelte
If we want to access the entire object, Svelte also gives us that option. For example, instead of writing {#each locations as { country, city }, i} we can simply write {#each locations as location, i}. Let's look at an example. The code below works exactly the same as the previous examples, but instead of directly referencing country we write location.country. Both ways are the same, so it's up to you which you prefer to use. Some people prefer this version due to its simplicity, as you don't have to constantly redefine which elements from the object to use.

<script>
    let locations = [
        { country: 'UK', city: 'London' },
        { country: 'India', city: 'Mumbai' },
        { country: 'France', city: 'Paris' }
    ]
</script>

<div id="data">
    {#each locations as location, i}
        <div class="country-city">{i}: {location.country}, {location.city}</div>
    {/each}
</div>

<style>
    #data {
        padding: 1rem;
        border-radius: 4px;
        border: 1px solid #eee;
    }
</style>
Keyed each Statements in Svelte
If you already have predefined identifiers, you can pass these to Svelte along with the index. The benefit of this is that if the data is updated or changed, you retain a strong link to the original data set, and Svelte will use it to diff the list when data changes rather than just adding or removing something at the end. A good way to think about this is that if you provide a key, you are telling Svelte to map HTML elements to that key. If the item with the key is removed, Svelte will remove the DOM element. If the properties change for that key, Svelte will update the DOM element. If instead you don't provide a key, Svelte goes off the array index. That can lead to some problems: if we remove an element from the start, for example, the first DOM element will simply be updated with properties from the new first item in the array. As such, it can be useful to use a unique key.

How to use Keyed each Statements in Svelte

let locations = [
    { id: 1, country: 'UK', city: 'London' },
    { id: 2, country: 'India', city: 'Mumbai' },
    { id: 3, country: 'France', city: 'Paris' }
]

We could then define id as our unique key for each item. If we wanted to do that, it'd look like this:

{#each locations as location, i (location.id)}

Or, if we're deconstructing, we could also write it like this:

{#each locations as { id, country, city }, i (id)}

Or, finally, we can remove i altogether and simply refer to id. This will let us use our id in code and also link the DOM element to our id property:

{#each locations as { id, country, city } (id)}

Using this in code, we can still achieve the same results, but we will have hard-linked our DOM elements to particular array elements:

<script>
    let locations = [
        { id: 1, country: 'UK', city: 'London' },
        { id: 2, country: 'India', city: 'Mumbai' },
        { id: 3, country: 'France', city: 'Paris' }
    ]
</script>

<div id="data">
    {#each locations as location, i (location.id)}
        <div class="country-city">{i}: {location.country}, {location.city}</div>
    {/each}
</div>

<style>
    #data {
        padding: 1rem;
        border-radius: 4px;
        border: 1px solid #eee;
    }
</style>

Each else statements in Svelte
We can define default behaviour for an empty list using the else statement. Simply define your each statement as normal and add it in after; it will only display if the list is empty. Here is an example:

<div id="data">
    {#each locations as location, i (location.id)}
        <div class="country-city">{i}: {location.country}, {location.city}</div>
    {:else}
        <div class="empty-list">No items to show</div>
    {/each}
</div>
2022-04-04 17:07:45
海外TECH DEV Community Microsoft Hackathon https://dev.to/migueldsalmeida/building-the-future-egb Microsoft Hackathon
Microsoft and Galp have teamed up to create the Building the Future Hackathon. Develop a low-carbon future and get a chance to win €. Register at TAIKAI. 2022-04-04 17:07:02
Apple AppleInsider - Frontpage News Apple TV+ lands Harrison Ford for starring role in 'Shrinking' https://appleinsider.com/articles/22/04/04/apple-tv-lands-harrison-ford-for-starring-role-in-shrinking?utm_medium=rss Apple TV lands Harrison Ford for starring role in x Shrinking x Star Wars and Indiana Jones icon Harrison Ford has signed with Apple TV to star in Shrinking a new comedy from the makers of Ted Lasso Harrison Ford Source Gage Skidmore Harrison Ford began his acting career on television with guest roles in series from The Virginian to Petrocelli Other than appearing as himself in a TV movie and a cameo in s The Young Indiana Jones Chronicles he hasn t been on television since The Star Wars Holiday Special in Read more 2022-04-04 17:00:55
海外TECH Engadget DJI made a $329 clip-on mic for your vlogs https://www.engadget.com/dji-mic-clip-on-lapel-175000217.html?src=rss DJI made a clip on mic for your vlogsDJI is better known for its drones and cameras than any of its audio tech but it s apparently eager to change your mind The company has released the Mic its first dedicated audio recording gear in the US The wireless clip on system promises bit KHz audio capture for your vlogs or other spoken word content at distances of up to feet That s not too special in itself but DJI is clearly hoping to snag wireless earbud fans with the design ーyou charge the transmitters and receiver in a battery case that provides a total of hours of use You re looking at up to hours of use per session The Mic can output through a mm jack Lightning and USB C and you ll get familiar audio adjustments like sensitivity between dB and dB and variable gain An included furry windscreen will prevent a blustery day from ruining your show The system is available now for That s a lot to spend if you re just looking to record audio using your phone You can spend a fraction of the price if you only need the basics However the outlay might be easier to rationalize if you either depend on long distance recording or want the flexibility that DJI s charging case and output selection can offer 2022-04-04 17:50:00
海外TECH Engadget Amazon's Prime Video and IMDb TV are staying on Roku https://www.engadget.com/roku-amazon-prime-video-imdb-tv-deal-172436208.html?src=rss Amazon x s Prime Video and IMDb TV are staying on RokuRoku users who might be used to major third party services disappearing for a while or taking forever to arrive won t lose access to Prime Video and IMDb TV any time soon Amazon and Roku have reached a multi year deal to keep the apps on the platform “Roku and Amazon have reached a multi year extension for their distribution agreement Roku said in a statement “Customers can continue to access the Prime Video and IMDb TV apps on their Roku devices It didn t disclose terms of the deal The Amazon negotiations seem to have gone more smoothly than talks with other streaming services It took months for Roku to reach a deal with WarnerMedia to get HBO Max on the platform As for YouTube TV that vanished from Roku for eight months The company and Google aired their grievances in public while working on a new deal Securing all those agreements means users will have access to more of the streaming services they might want to use under one umbrella 2022-04-04 17:24:36
海外科学 NYT > Science Holy Cross to Rename Science Complex for Fauci https://www.nytimes.com/2022/04/04/us/fauci-holy-cross.html anthony 2022-04-04 17:53:26
海外科学 NYT > Science 5 Takeaways From the U.N. Report on Limiting Global Warming https://www.nytimes.com/2022/04/04/climate/ipcc-report-explained.html levels 2022-04-04 17:26:08
金融 金融庁ホームページ Announcement that a Financial Services Agency (FSA) employee has tested positive for the novel coronavirus https://www.fsa.go.jp/news/r3/sonota/20220404.html novel coronavirus 2022-04-04 18:40:00
ニュース @日本経済新聞 電子版 Shanghai, China to extend its lockdown as COVID-19 infections keep spreading https://t.co/2lwMJ4TlBO https://twitter.com/nikkei/statuses/1511036808666431488 spread of infection 2022-04-04 17:43:43
ニュース @日本経済新聞 電子版 Vertical-screen dramas watched one-handed: immersion gained in spare moments https://t.co/PvJyYro1kS https://twitter.com/nikkei/statuses/1511036038457675780 spare moments 2022-04-04 17:40:39
ニュース @日本経済新聞 電子版 U.S. ambassador to the UN calls for Russia's expulsion from the Human Rights Council; procedure begins https://t.co/u1HBhtEX4W https://twitter.com/nikkei/statuses/1511031534840872960 UN ambassador 2022-04-04 17:22:46
ニュース @日本経済新聞 電子版 Vertical content that "hits home": SHOWROOM's Yuji Maeda https://t.co/2pmOLlUsdB https://twitter.com/nikkei/statuses/1511026488900222978 showroom 2022-04-04 17:02:43
ニュース @日本経済新聞 電子版 Amazon sports streaming, a fourth vaccine, and higher starting salaries https://t.co/05DY9BTzKb https://twitter.com/nikkei/statuses/1511026486438162432 amazon 2022-04-04 17:02:42
ニュース BBC News - Home Ukraine war: UK will not rest until justice is done, says Johnson https://www.bbc.co.uk/news/uk-60985682?at_medium=RSS&at_campaign=KARANGA graves 2022-04-04 17:26:16
ニュース BBC News - Home Cryptocurrency: UK Treasury to regulate some stablecoins https://www.bbc.co.uk/news/business-60983561?at_medium=RSS&at_campaign=KARANGA cryptocurrency 2022-04-04 17:15:17
ビジネス ダイヤモンド・オンライン - 新着記事 Getting a takken (real-estate broker) license raises your income! Three benefits explained in full - The "one sheet of paper" study method for memorizing a lot and never forgetting https://diamond.jp/articles/-/300996 qualification 2022-04-05 02:50:00
ビジネス ダイヤモンド・オンライン - 新着記事 Why Turkey became the mediator in Russia's invasion of Ukraine - Learn economics from geography! https://diamond.jp/articles/-/300982 economics 2022-04-05 02:45:00
ビジネス ダイヤモンド・オンライン - 新着記事 NATO's front line against Russia: a Romanian base rapidly transformed - From the WSJ https://diamond.jp/articles/-/301031 front line 2022-04-05 02:05:00
GCP Cloud Blog How Managed Security Service Providers can accelerate their business with Google Cloud Security’s Partner Program using Google Chronicle https://cloud.google.com/blog/products/identity-security/launch-of-the-google-cloud-securitys-mssp-partner-program-with-google-chronicle/ How Managed Security Service Providers can accelerate their business with Google Cloud Security s Partner Program using Google ChronicleManaged Security Service Providers MSSPs can deliver high value security services for customers helping to drive efficiencies in security operations across people product and processes In an environment where the threat landscape continues to be challenging MSSPs can allow customers to scale their security teams driving enhanced security outcomes At the same time MSSPs operating their own SOC team can face challenges from core operating capabilities around an increasing number of alerts to the shortage of skilled security professionals to the highly manual and “tribal knowledge investigation and response approach MSSPs are generally constantly looking at opportunities to enhance customer satisfaction while providing advanced security operations capability  To help we are excited to announce our new Chronicle MSSP Program which will offer MSSPs around the world the ability to provide scalable differentiated and effective detection and response capabilities with our cloud native SIEM product Chronicle In a highly competitive environment where customers have little to differentiate between various MSSP providers we are helping to turbocharge our MSSP partners with specialized services offerings enabling branded portals and advanced threat detection investigation and response capabilities   We are proud to partner with Google Cloud Security to solve functional challenges that exist in security for our customers As a major partner and a distributor MSSP we are excited to leverage this new program helping our customers and delivering security outcomes ーRobert Herjavec CEO Herjavec Group and Fishtech GroupOur partners can help drive success for their business with  Google scale partnership support to help grow your business Go to market with a team that are incentivized to sell your solution Help unlock greenfield accounts and expand into new territories quickly    More controls over margins and easy straight forward pricing The modern licensing model gives MSSPs advanced control over their margins   Building differentiated solutions that demonstrate your expertise Chronicle MSSPs can add their solution on Chronicle to help make their solution both unique in the market and easier to sell MSSPs can drive additional leverage with branded reporting unique solutions and advanced threat intelligence Additionally our partners are able to utilize key technical differentiators in Chronicle to help drive value for customers  API driven multi tenancy We can make it easier for you by helping to streamline and automate customer management workflows and enable the delivery of fully featured instances in a few API calls Ingest everything helping to ensure no more blindspots Chronicle is designed to ingest data from any cloud even the voluminous datasets e g EDR NDR Cloud This ability can enable security data to exist in one place and perhaps more importantly aliased and correlated into a timeline of events This capability can enable SOCs to begin to operationalize their data into meaningful signals Help prioritize threats and quickly respond to alerts with context aware detections With 
context aware detections in Chronicle the supporting information from authoritative sources e g CMDB IAM and DLP including telemetry context relationships and vulnerabilities are available as a “single detection event Our partners can use this capability to write context driven detections prioritize existing alerts and drive fast investigation Simply put Google Chronicle will help reduce the MTTR mean time to respond for our partners by helping to minimize the need to wait for contextual understanding before making a decision and taking an investigatory action which can lead to greater customer and cost benefits  We have partners already using the Chronicle MSSP program Our partners like CYDERES Netenrich and Novacoast among others have used this program to help accelerate customers security operations modernization journeys  We at Google Cloud are helping to drive innovations that are foundational to security operations and helping our partners support customers effectively The Chronicle MSSP Program builds on the momentum of our MSSP program for VirusTotal which can provide our partners with world class crowdsourced threat intelligence  To learn more about the Chronicle and VirusTotal MSSP programs register for our MSSP webinar  For more information about the Chronicle MSSP Program contact us at gcsecurity mssp google com Additionally learn more about our VirusTotal MSSP programRelated ArticleIntroducing Community Security AnalyticsIntroducing Community Security Analytics an open source repository of queries for self service security analytics to help you get starte Read Article 2022-04-04 17:30:00
