Posted 2023-08-17 03:20:53. RSS feed digest for 2023-08-17 03:00 (24 items).

Category / Site / Article title or trending keyword / Link URL / Frequent words, summary, or search volume / Date registered
AWS AWS for SAP Automate incident management for SAP on AWS https://aws.amazon.com/blogs/awsforsap/automate-incident-management-for-sap-on-aws/ Introduction: Amazon CloudWatch Application Insights helps you monitor applications that use Amazon Elastic Compute Cloud (EC2) instances along with other underlying AWS resources. It identifies and sets up key metrics, logs, and alarms across your application resources and technology stack by continuously monitoring metrics and logs to detect and correlate anomalies and errors. When … 2023-08-16 17:35:25
Overseas TECH Ars Technica $220 billion is helping build US cleantech infrastructure. Here are the projects. https://arstechnica.com/?p=1961256 large 2023-08-16 17:17:19
Overseas TECH MakeUseOf 6 Sites to Find Entertainment Posters https://www.makeuseof.com/entertainment-poster-websites/ posters 2023-08-16 17:31:24
Overseas TECH MakeUseOf How to Create a Facebook Cover Photo Using Your Avatar https://www.makeuseof.com/how-to-create-facebook-cover-photo-with-avatar/ facebook 2023-08-16 17:16:22
Overseas TECH MakeUseOf The 7 Best Encryption Apps for Windows https://www.makeuseof.com/best-encryption-apps-for-windows/ encryption 2023-08-16 17:01:22
Overseas TECH DEV Community build your own FAAS provider https://dev.to/saphidev/build-your-own-faas-provider-2gca Build your own FaaS provider: slsdz (apotox/slsdz on GitHub) is a command-line interface (CLI) tool and serverless FaaS for creating serverless applications with random subdomains, simplifying the setup of serverless functions on AWS. The author frames it as a learning project for Terraform and AWS rather than the ideal way to build a FaaS, especially for cost: calling one Lambda function from another ("double calling") adds execution time, network latency, and resource usage, which raises costs and slows responses. Installation: run yarn install and yarn bundle, cd into demo, and call mycli.js --help (mycli.js is a shortcut to the bundled cli.js). Options cover help, version, id (function id), secret (function secret), init (initialize a new function id and secret), f (upload a function package zip), verbose level, function log, and about; the id and secret are required, can be generated with init, and can also be passed as the environment variables SFUNCTION_ID and SFUNCTION_SECRET. A minimal index.js exports an async handler returning a status code and a "Hello World" body; you zip it (zip function.zip index.js), run init to generate a local .slsdz file holding the function credentials, and upload the zip to receive a function URL. How it works: the AWS pieces are Lambda, API Gateway (HTTP requests), S3 (function code storage), CloudFormation (creating user functions), CloudWatch (logs), and EventBridge (triggering CNAME creation), with Cloudflare as the external DNS service. Developers use the slsdz CLI without any AWS credentials; each user initializes a function and receives an ID and a secret, the secret acting as an ID signature for uploads and updates, both saved in the local .slsdz file. When code is uploaded, a Signer Lambda generates a signed URL whose object key is the function id plus .zip, so the id can be recovered later when handling the S3 ObjectPut event. That event triggers a Deployer Lambda, which reads the uploaded zip, extracts the function id from the file name, and either updates the existing function or, if it does not yet exist, builds a CloudFormation template that adds an API mapping and a custom domain to API Gateway and creates a new Lambda function with a basic IAM role. The stack name embeds the function id (plus a Date.now() timestamp) so it can be extracted when handling the creation event. Once the stack is created, a stack-creation event is published to Amazon EventBridge, where a subscribed cname Lambda uses the functionId to create a CNAME record through the Cloudflare API, after which the record appears in the Cloudflare dashboard. Project structure: three components, all TypeScript plus Terraform for infrastructure as code, sharing dependencies via Yarn workspaces: sls-lambda (one folder per Lambda function, each with an index.ts handler), sls-cli (the CLI, built with Yargs), and infra (the dev subdirectory holds per-function .tf files prefixed with fn, a data.tf that zips function bundles into release zips via archive_file data sources, and IAM policies declared as JSON templates populated with the templatefile function). To deploy to your own AWS account: clone the repo, run yarn install, export your AWS credentials and the Terraform variables (custom API domain name, signing secret, Cloudflare zone id, email, and API key, certificate ARN), add your own domain to Cloudflare with a CNAME pointing at your API Gateway, bootstrap the Terraform S3 backend (comment it out for the first terraform init/apply of infra/globals/s3, then re-enable it and init/apply again), and finally run terraform init and apply in infra/dev; to add another stage such as prod, copy the dev folder and give its backend a unique state key. CLI constants (config filename, domain name, API base URL) live in slsdz-cli/src/consts.ts. Summary: slsdz is "serverless" plus "dz", the country abbreviation many Algerian developers like to add to their projects; it is a developer-friendly FaaS provider that deploys functions to AWS through a CLI and a signature-based authentication flow, though it still needs work (for example monitoring and logging) to be production ready. A rough sketch of the signed-upload step, in Go, follows this entry. 2023-08-16 17:40:17
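The slsdz project itself is TypeScript and Terraform; purely as an illustration of the signed-upload idea described above, here is a minimal Go sketch that presigns an S3 PUT for an object named after the function id, using the AWS SDK for Go v2. The bucket name, function id, and expiry below are hypothetical and not taken from the project.

```go
package main

import (
	"context"
	"fmt"
	"log"
	"time"

	"github.com/aws/aws-sdk-go-v2/config"
	"github.com/aws/aws-sdk-go-v2/service/s3"
)

// presignFunctionUpload returns a presigned PUT URL for "<functionID>.zip".
// The function id is embedded in the object key so the ObjectPut event
// handler can recover it later, as the entry above describes.
func presignFunctionUpload(ctx context.Context, bucket, functionID string) (string, error) {
	cfg, err := config.LoadDefaultConfig(ctx)
	if err != nil {
		return "", err
	}
	client := s3.NewFromConfig(cfg)
	presigner := s3.NewPresignClient(client)

	key := functionID + ".zip"
	req, err := presigner.PresignPutObject(ctx, &s3.PutObjectInput{
		Bucket: &bucket,
		Key:    &key,
	}, s3.WithPresignExpires(15*time.Minute)) // 15 minutes is an arbitrary choice
	if err != nil {
		return "", err
	}
	return req.URL, nil
}

func main() {
	// Hypothetical bucket and function id, for illustration only.
	url, err := presignFunctionUpload(context.Background(), "my-functions-bucket", "fn-123abc")
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("upload your zip with an HTTP PUT to:", url)
}
```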
Overseas TECH DEV Community The significance of performance testing for retail and e-commerce apps https://dev.to/williamamanda23/the-significance-of-performance-testing-for-retail-and-e-commerce-apps-1e89 E-commerce and retail apps have grown in importance thanks to their convenience, personalization, and seamless shopping experiences, so their performance needs to be flawless: slow loading, crashes, or other issues lead to frustrated users and abandoned transactions. The post covers why performance testing matters for retail applications: customer satisfaction, brand reputation, revenue generation, user engagement, competitive edge, scalability during seasonal sales or promotions, cost effectiveness (catching problems before the app goes live), stability and security under stress, and customer retention. It then argues for KPIs (key performance indicators) as measurable metrics for performance monitoring, user-experience improvement, stability under peak load, identifying weak points, optimizing resource utilization, data-driven decision making, competitive benchmarking, and enhancing conversion rates. The critical KPIs that retail and e-commerce performance testing measures include average response time, transaction throughput, error rate, page load time, peak user load handling, resource utilization (CPU, memory, network), latency, database performance, and network performance under varying conditions; a small illustrative sketch of computing a few of these from raw samples follows this entry. The types of performance tests performed on retail apps are load testing (expected and peak user loads), stress testing (behavior at extreme limits), endurance testing (memory leaks and degradation over hours or days), scalability testing, spike testing (sudden surges in traffic), security testing, network latency testing, transaction testing (high volumes of orders, payments, and inventory updates), and concurrent user testing. Finally, the post describes how HeadSpin streamlines performance testing for retail and e-commerce brands: a data-science-driven platform with deep ML models that pinpoints high-priority issues, offering end-to-end monitoring of application flows, performance-regression tracking across app updates, actionable insights with comparative views of UX and performance for peer applications, UX benchmarking of critical user journeys (login times, product searches, checkout), and testing on a diverse set of real devices through a global device infrastructure. Bottom line: performance optimization is a business imperative in a competitive digital landscape; a high-performing retail app fosters loyalty, satisfaction, and brand credibility, and with the right testing tools and strategies retailers can keep delighting customers with exceptional digital experiences. 2023-08-16 17:20:39
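The entry above lists KPI definitions but no code; as a loose, hypothetical illustration of how three of those KPIs (average response time, error rate, throughput) could be computed from raw load-test samples, here is a small Go sketch. The Sample type and the numbers in main are made up for the example.

```go
package main

import (
	"fmt"
	"time"
)

// Sample is one observed request from a load test (hypothetical structure).
type Sample struct {
	Latency time.Duration
	Failed  bool
}

// kpis returns average response time, error rate (%), and throughput
// (requests per second) over the test window, matching the KPI
// definitions listed in the entry above.
func kpis(samples []Sample, window time.Duration) (avg time.Duration, errRate, throughput float64) {
	if len(samples) == 0 || window <= 0 {
		return 0, 0, 0
	}
	var total time.Duration
	var failures int
	for _, s := range samples {
		total += s.Latency
		if s.Failed {
			failures++
		}
	}
	avg = total / time.Duration(len(samples))
	errRate = 100 * float64(failures) / float64(len(samples))
	throughput = float64(len(samples)) / window.Seconds()
	return avg, errRate, throughput
}

func main() {
	samples := []Sample{
		{Latency: 120 * time.Millisecond},
		{Latency: 340 * time.Millisecond, Failed: true},
		{Latency: 95 * time.Millisecond},
	}
	avg, errRate, rps := kpis(samples, 2*time.Second)
	fmt.Printf("avg response time: %v, error rate: %.1f%%, throughput: %.1f req/s\n", avg, errRate, rps)
}
```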
Overseas TECH DEV Community Migrate from WordPress to a headless CMS in 3 easy steps https://dev.to/tonyspiro/migrate-from-wordpress-to-a-headless-cms-in-3-easy-steps-20fp The article shows how to migrate from WordPress to a headless CMS in three easy steps using the Cosmic WordPress Importer extension, which imports posts from any publicly available WordPress RSS feed; it assumes at least a cursory knowledge of WordPress and headless CMS and points to the WordPress vs. headless CMS comparison page. TL;DR: sign up for Cosmic, install the WordPress extension, and view the source code to see how the extension is built. WordPress powers a large share of the web, and this piece is aimed at people who are fed up with the issues that come with one of the world's most popular content management systems and are ready to go headless. Step 1: create a free Cosmic account and set up a new project. Step 2: go to the Extensions tab, find the Cosmic WordPress Importer extension, and install it. Step 3: migrate your content by adding a feed URL (any publicly available WordPress RSS feed), selecting how many posts to import, and clicking Submit; the WordPress posts populate into Cosmic automatically, and refreshing the page shows links to the imported posts. The posts are then available via a simple API endpoint whose JSON payload is a list of objects, each with a slug, title, and metadata (content, snippet, author, categories), so the content can be delivered to any website or application; an illustrative sketch of consuming such a payload follows this entry. Bonus: the importer's code is open source, so you can customize the data it pulls in or learn to build your own Cosmic extension; reach out to Cosmic support for questions or dedicated migration help. 2023-08-16 17:17:00
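As an illustration only, a consumer of the JSON payload described above might look like the Go sketch below; the endpoint URL and the exact field names are assumptions for the example, not Cosmic's documented API.

```go
package main

import (
	"encoding/json"
	"fmt"
	"log"
	"net/http"
)

// Post mirrors the payload shape shown in the article: slug, title, and a
// metadata object with content, snippet, author, and categories. Field
// names here are guesses for illustration, not Cosmic's documented schema.
type Post struct {
	Slug     string `json:"slug"`
	Title    string `json:"title"`
	Metadata struct {
		Content    string   `json:"content"`
		Snippet    string   `json:"snippet"`
		Author     string   `json:"author"`
		Categories []string `json:"categories"`
	} `json:"metadata"`
}

type payload struct {
	Objects []Post `json:"objects"`
}

func main() {
	// Hypothetical endpoint; substitute the real API URL for your own bucket.
	resp, err := http.Get("https://example.com/api/posts")
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	var p payload
	if err := json.NewDecoder(resp.Body).Decode(&p); err != nil {
		log.Fatal(err)
	}
	for _, post := range p.Objects {
		fmt.Println(post.Slug, "-", post.Title, post.Metadata.Categories)
	}
}
```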
Overseas TECH DEV Community How to create Sitemap.xml for ASP.net Core Razor Pages https://dev.to/xakpc/how-to-create-sitemapxml-for-aspnet-core-razor-pages-38fl The author documents how they added a sitemap to a Razor Pages web application, since many examples on the web are outdated; a sitemap gives search-engine bots a roadmap so they can index a site's content more efficiently. Step 1: understand the structure of sitemap.xml, a urlset of url entries, each with a loc and optional metadata such as lastmod; the protocol also defines priority and changefreq, but Google says it ignores those and only uses lastmod when it is consistently and verifiably accurate. Step 2: create a model for the sitemap: a SitemapNode class with Frequency, LastModified, Priority, and Url properties plus a SitemapFrequency enum (Never, Yearly, Monthly, Weekly, Daily, Hourly, Always); an alternative model decorated with XmlRoot/XmlElement/XmlEnum attributes for serialization looks cleaner but takes roughly twice as many lines, so it is skipped. Step 3: set up a method to generate the sitemap nodes: the SitemapModel page model receives a LinkGenerator via constructor injection and uses GetUriByPage to build absolute URLs for static pages (Index, Tools/CreateCode, Legal/Privacy, Legal/TermsOfService), while blog and other dynamic pages are filled in from the database or file system. Step 4: create the sitemap page: a Razor Page (Sitemap.cshtml) with Layout set to null and Response.ContentType set to text/xml that emits Html.Raw(Model.RawXmlData), plus an AddPageRoute convention in the AddRazorPages options that maps the /Sitemap page to the sitemap.xml path. Step 5: format the XML: GetSitemapDocument builds the document manually with XDocument/XElement (loc, lastmod, changefreq, priority per node), with XmlSerializer as an alternative, and OnGet calls GetSitemapNodes and GetSitemapDocument to populate the bound RawXmlData property. Step 6: register the sitemap with search engines via Google Search Console and Bing Webmaster Tools. The conclusion shows the final sitemap.xml and notes that the whole setup is straightforward; following the steps above and adding the appropriate code ensures your website is more visible and accessible to search engines. A cross-language sketch in Go follows this entry. 2023-08-16 17:16:34
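The article's implementation is C# on Razor Pages; purely as a cross-language illustration of the same urlset structure (not part of the original post), here is a minimal Go sketch that marshals a sitemap with encoding/xml. The URLs are placeholders.

```go
package main

import (
	"encoding/xml"
	"fmt"
	"log"
	"time"
)

// sitemapURL is one <url> entry; omitempty drops the optional fields,
// mirroring the article's note that Google only really uses <lastmod>.
type sitemapURL struct {
	Loc        string  `xml:"loc"`
	LastMod    string  `xml:"lastmod,omitempty"`
	ChangeFreq string  `xml:"changefreq,omitempty"`
	Priority   float64 `xml:"priority,omitempty"`
}

type urlSet struct {
	XMLName xml.Name     `xml:"urlset"`
	XMLNS   string       `xml:"xmlns,attr"`
	URLs    []sitemapURL `xml:"url"`
}

func main() {
	set := urlSet{
		XMLNS: "http://www.sitemaps.org/schemas/sitemap/0.9",
		URLs: []sitemapURL{
			{Loc: "https://example.com/", LastMod: time.Now().Format("2006-01-02"), Priority: 1.0},
			{Loc: "https://example.com/legal/privacy", Priority: 0.5},
		},
	}
	out, err := xml.MarshalIndent(set, "", "  ")
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(xml.Header + string(out))
}
```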
海外TECH DEV Community "Pivoting to Tech from Biomedical Science": CodeNewbie Podcast S25E1 https://dev.to/codenewbieteam/pivoting-to-tech-from-biomedical-science-codenewbie-podcast-s25e1-khn quot Pivoting to Tech from Biomedical Science quot CodeNewbie Podcast SE We are back y all In the first episode of our th season of the CodeNewbie Podcast saronyitbarek talks about making a major career change and the significance of laying a strong foundation with with Marley Anthony Software Engineer at Bench Accounting codenewbie org Marley is a software engineer photographer and outdoor enthusiast based in Vancouver BC He love tech and spending time outside riding bikes or hiking Tune in to gain valuable perspectives on strategies for landing that all important internship fostering growth and embracing the ongoing pursuit of knowledge Listen on Apple PodcastsListen on SpotifyOr listen wherever you normally get your podcasts WOW we are happy to be back for another season Make sure to follow us on your preferred platform if you haven t already And beyond allーhappy coding y all 2023-08-16 17:16:22
Overseas TECH DEV Community Dependency Injection in Flutter https://dev.to/alvbarros/dependency-injection-in-flutter-598k The article explains what dependency injection (DI) is, how to do it in Flutter, and why, with examples and a link to a GitHub repo. Quoting Wikipedia: dependency injection is a design pattern in which an object or function receives the objects or functions it depends on; a form of inversion of control, it separates constructing objects from using them, leading to loosely coupled programs. In other words, instead of creating objects inside a class or method, those objects are injected from outside, so the class only needs to know how to use its dependencies, not how to build them, which makes code easier to test and maintain. Pros: easier testing (mocks can be injected) and easier maintenance (implementations can change without affecting consumers). Cons: added complexity, possible performance overhead, and runtime errors such as null-pointer exceptions if dependencies are not managed or injected properly. The car example: a Car class that calls engine.start() throws a null-reference error if its Engine was never provided. With constructor injection the dependencies are passed through the constructor, so it is clear what the class requires and the Engine is guaranteed to exist as soon as the Car is created (a final, required engine field can never be null). But as more parts are added (wheels, doors, windows), constructor injection means a Car instance only exists once every piece is ready, and it makes little sense that the doors cannot work until the engine does. With setter injection the dependencies are set through setter methods after the class is created (setEngine gives the Car an engine later), which adds flexibility but requires validation to avoid runtime errors (see null safety in Dart). Other types are introduced briefly: interface injection, ambient context (the provider package), and service locator (the get_it package). Why bother? In one of the author's projects, an authentication layer was built around an abstract AuthenticationProvider with a signIn(email, password) method returning a UserSession; an AuthenticationRepository takes a provider in its constructor, and concrete FirebaseProvider and CognitoProvider implementations wrap the Firebase and Cognito packages, so the backing service can be swapped simply by registering a different provider in a GetIt service locator. For testing, a MockAuthenticationProvider implements the same interface and returns a canned UserSession when a known password is supplied, which lets unit tests cover both the success path (a valid session token) and the failure path (an exception when the provider returns null) without integrating with Cognito or Firebase. A rough Go analogue of constructor injection appears after this entry. 2023-08-16 17:14:00
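The article's examples are Dart; as a rough cross-language analogue of the constructor-injection pattern it describes (not code from the article or its repo), here is a minimal Go sketch where an Engine interface lets tests inject a mock.

```go
package main

import "fmt"

// Engine is the dependency the Car needs; an interface lets tests inject a mock.
type Engine interface {
	Start() error
}

// Car receives its Engine through the constructor, so a Car can never
// exist without one: the constructor-injection idea from the article.
type Car struct {
	engine Engine
}

func NewCar(engine Engine) (*Car, error) {
	if engine == nil {
		return nil, fmt.Errorf("car needs an engine")
	}
	return &Car{engine: engine}, nil
}

func (c *Car) Start() error { return c.engine.Start() }

// petrolEngine is one concrete implementation; a test could supply a fake instead.
type petrolEngine struct{}

func (petrolEngine) Start() error {
	fmt.Println("vroom")
	return nil
}

func main() {
	car, err := NewCar(petrolEngine{})
	if err != nil {
		panic(err)
	}
	_ = car.Start()
}
```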
海外TECH DEV Community The Golang Saga: A Coder’s Journey There and Back Again. Part 3: The Graphing Conundrum https://dev.to/olgabraginskaya/the-golang-saga-a-coders-journey-there-and-back-again-part-3-the-graphing-conundrum-12h2 The Golang Saga A Coder s Journey There and Back Again Part The Graphing ConundrumWelcome back to the third part of “The Golang Saga A Coder s Journey There and Back Again In the first part of this series I started work on a personal project the Climate Change Visualizer using Go as my chosen programming language In the second part we delved into a data expedition exploring available data formats and filters in CDO Now it s time to take a deeper dive into visualization In this chapter I will focus on creating graphs in Go using the CDO data for Tel Aviv spanning from to We ll explore various techniques to illustrate climate change patterns effectively As a quick reminder in the previous article we ve selected the following charts to illustrate climate change patterns Scatter plot that illustrates how the minimum and maximum temperatures change throughout the years on average Heatmap visualizing yearly average temperatures Temperature anomaly graph that shows average temperature variations throughout years comparing them to the chosen baseline Climate change stripes that uses colored stripes to represent each year s average temperature Let s continue this coding adventure and explore the world of visualizations with Go All the code used in this article can be found in this repository Chapter where the main character discovers that the world of data is not perfectBefore diving into the visualization process let s revisit the data we downloaded for Tel Aviv As a quick reminder the CSV looks like this Additionally I copied a sample of data from another station in this CSV file As we recall from the second article we requested daily temperature records from four different stations However even from those few rows we can observe that some data is missing indicated by the empty values in the TAVG TMAX and TMIN columns This raises the possibility that there might be more missing values in the dataset To gain a better understanding of the missing data our next step is to create a graph that visualizes the overall situation I believe a grouped bar chart per station with the sum of empty values for TAVG TMAX and TMIN as the values would be a suitable choice for this purpose To build the plots in this article I ll be using the gonum plot library which provides a range of tools for generating different types of charts and graphs While I m new to gonum plot I believe it has the potential to effectively showcase temperature trends and patterns over time You can get gonum plot using the following command in your terminal go get gonum org v plot As a quick reminder in the previous article I created a Jupyter notebook with Go kernel where I read the CSV file and stored the data in the cache Now I can use the cached csv records variable in other cells of this notebook which allows easier data manipulation and visualization without having to reload the data each time Now my next task is to calculate the number of empty TAVG TMAX and TMIN values in the CSV file While working on the code I had the opportunity to explore and understand the map data type in Go Let s take a closer look at the code below As you can see I defined a data structure called StationData which represents information about a weather station including its name and the number of missing values for TAVG average 
temperature TMAX maximum temperature and TMIN minimum temperature data Then I created a function called count missing data which takes a two dimensional slice of strings data representing the weather records Inside this function I used a map stationMap to store the StationData objects for each unique station name Next I looped through each record in the CSV data and extracted the station name Then I counted the number of missing values for TAVG TMAX and TMIN in each record and updated the corresponding fields in the StationData object As a result I have stations counts variable in the cache showing the number of missing TAVG TMAX and TMIN values for each weather station ISE IS ISE ISM And with this map now we are ready to create a group bar chart for each station to find out which station is the best for each type of value I found a helpful tutorial on gonum plot so I m going to use plotter NewBarChart for my purposes In the code above I set up three groups groupA groupB and groupC to store the counts for each station s missing TAVG TMAX and TMIN values respectively Then I created three bar charts with different colors barsA barsB and barsC for the three groups of data I added those three bar charts to the plot p The last part of the code is responsible for rendering the plot as a PNG image and showing it directly in Jupyter Notebook It uses Gonum Plot s WriterTo method to generate the image and then displays it using the gonbui DisplayPNG function From the graph above we can see that station ISM has the best data availability for TAVG values red bar is almost invisible but not as good for TMAX and TMIN values On the other hand other stations show the opposite pattern with better data for TMAX green bar and TMIN blue bar but less for TAVG At that point I was focused solely on checking for empty values in the dataset However there might be additional challenges such as completely missing days of data To get an overall view of the situation I decided to create a cross table spanning from to This table displays the years on the left side and indicates the percentage of existing TAVG TMAX and TMIN values relative to the total number of days in each year days for leap years In this code below I created a data structure called StationYearData to store the count of existing TAVG TMAX and TMIN values as well as the total number of days for each year and station combination Then I calculated those numbers and stored them in the stationsData map which is organized by station and year Next I extracted all unique years and stations from the stationsData map to create the headers for the cross table I sorted the stations and years to ensure consistent order in the table and wrote the results to output csv file As you can see from the result table below our analysis revealed that stations IS and ISE have missing data for the entire range of recent years This suggests that these stations might have stopped taking measurements during that period We can conclude that the data from the ISM station is suitable for creating visualizations based on average temperature On the other hand the data from the ISE station can be used as a source for visualizations related to maximum and minimum temperatures Generally speaking in the future we will need to take into account that not all data may be available and we may need to find ways to fill in the missing values However for now we will leave it as it is Chapter where the main character becomes frustratedFinally let s have some fun We have our data in csv records 
variable in the cache let s install all Go modules for visualization go get gonum org v plotgo get gonum org v plot plottergo get gonum org v plot vggo get gonum org v plot plotutil Scatter plotFor scatter plot that illustrates how the minimum and maximum temperatures change throughout the years on average we need to calculate average minimum and average maximum temperature per year for a chosen station ISE I m going to use plotter NewScatter from the tutorial to draw the graph In the code below I created yearlySummary map where each key represents a year and the corresponding value is a TempSummary struct containing the total TMAX and TMIN values for that year as well as the count of recorded temperatures Then iterating over the yearlySummary map I calculated the average TMAX and TMIN temperatures by dividing the total TMAX and TMIN values by the count of recorded temperatures for that year For each year I set the X coordinate of the i th data point in scatterData to the calculated tmaxAvg and the Y coordinate to the calculated tminAvg Then plotter NewScatter uses this data to create the scatter plot Let s take a look at the scatter plot below Frankly it doesn t look very impressive To be honest finding information on visualizing graphs in the Go language is quite challenging and the data available to ChatGPT about gonum org v plot module is outdated which makes it difficult to create visually appealing graphs However luck was on my side as I stumbled upon an example of a colored scatter plot that looked quite nice Let s look at the new code In this code I specified the style and color for individual points in the scatter plot The function sc GlyphStyleFunc takes an index i as input and returns a draw GlyphStyle that defines the appearance of the point at that index To determine the color of the point the code calculates a normalized value d based on the z coordinate of the point z and the minimum minZ and maximum maxZ z values in the scatter data It then uses this normalized value to interpolate between the colors defined in the colors colormap The new chart looks much better and we can observe a warming trend over the years based on the colors of the dots HeatmapI couldn t find a heatmap function in plot module so I asked ChatGTP for a help You re not helping ChatGPT But then I found this article and by following the provided example I managed to create my own heatmap So what is a heatmap A heatmap of average temperatures is a graphical representation that uses colors to show the average temperature values across a geographical area or a specific grid Warmer colors such as red or orange typically represent higher average temperatures while cooler colors like blue or green represent lower average temperatures To create the graph we first need to obtain the average temperatures per day of the each year and then adjust the colors accordingly The code below defines a custom struct called plottable which holds the data needed to create the heatmap The WeatherData struct represents the weather data for a specific station including temperature measurements Tavg Tmax and Tmin and other information I read data from the csv records variable and filter it to get the temperature data for the ISM station The dataset variable is a two dimensional Go slice that holds the average temperature data for each day of the year rows across different years columns Then I find only non empty years columns and use moreland SmoothBlueRed function to create a color palette for the heatmap The result has empty 
Heatmap. I couldn't find a heatmap function in the plot module, so I asked ChatGPT for help. (You're not helping, ChatGPT.) But then I found this article, and by following the provided example I managed to create my own heatmap. So what is a heatmap? A heatmap of average temperatures is a graphical representation that uses colors to show the average temperature values across a geographical area or a specific grid. Warmer colors such as red or orange typically represent higher average temperatures, while cooler colors like blue or green represent lower average temperatures. To create the graph, we first need to obtain the average temperatures per day of each year and then adjust the colors accordingly. The code below defines a custom struct called plottable, which holds the data needed to create the heatmap. The WeatherData struct represents the weather data for a specific station, including temperature measurements (Tavg, Tmax, and Tmin) and other information. I read data from the CSV records variable and filter it to get the temperature data for the ISM station. The dataset variable is a two-dimensional Go slice that holds the average temperature data for each day of the year (rows) across different years (columns). Then I keep only the non-empty years (columns) and use the moreland.SmoothBlueRed function to create a color palette for the heatmap. The result has empty lines due to the presence of empty or missing values in the CSV dataset, as I mentioned earlier during the data analysis. In the future, we may need to decide whether to ignore these gaps or implement techniques to fill in missing data, such as using a nearest-neighbor approach. Despite the empty lines, the graph clearly shows the expansion of the red funnel during the summer over the years, from the bottom to the top. This observation indicates a warming trend over time.

Temperature anomaly graph. For this graph, we should start by defining and calculating a baseline. I chose a span of years as the baseline period because the Internet said that it's the most common choice in my case. To calculate the average temperature for the baseline period, we need to add up the temperatures for each year within the selected time period and divide the total by the number of years. This average will be used as the reference point for comparing temperatures in subsequent years. To plot the graph, we will calculate the temperature anomalies for each year by subtracting the baseline average temperature from the actual temperature for that year. Positive anomalies indicate temperatures above the baseline, while negative anomalies indicate temperatures below the baseline. In the following code, I iterated through the CSV records, filtering and parsing the TAVG temperature data for the ISM station over the selected range of years. Within this process, I calculated the sum of TAVG values (baselineSum) and counted the number of valid records (baselineCount) within the baseline period. By dividing baselineSum by baselineCount, I obtained the baseline value. Subsequently, I calculated the temperature anomalies for each TAVG value in the data slice by subtracting the baseline value. Finally, I used the plotutil.AddLinePoints function to create and visualize the graph. The result is visually less appealing than I expected. Despite my efforts, I couldn't find a way to remove or resize those points, which has negatively affected the graph's aesthetics, so it doesn't look as clean and smooth as I had hoped. However, even with these issues, one can still observe the overall trend of rising temperatures over the years.
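As a rough illustration of that anomaly calculation and the plotutil.AddLinePoints step, a sketch might look like the following; the yearly averages, baseline window, and file name are placeholders, not the article's values.

package main

import (
    "log"

    "gonum.org/v1/plot"
    "gonum.org/v1/plot/plotter"
    "gonum.org/v1/plot/plotutil"
    "gonum.org/v1/plot/vg"
)

// anomalies turns yearly average temperatures into deviations from a baseline
// computed over the years [baseFrom, baseTo].
func anomalies(years []int, avgByYear map[int]float64, baseFrom, baseTo int) plotter.XYs {
    var baselineSum float64
    var baselineCount int
    for y, avg := range avgByYear {
        if y >= baseFrom && y <= baseTo {
            baselineSum += avg
            baselineCount++
        }
    }
    baseline := baselineSum / float64(baselineCount)

    pts := make(plotter.XYs, 0, len(years))
    for _, y := range years {
        pts = append(pts, plotter.XY{X: float64(y), Y: avgByYear[y] - baseline})
    }
    return pts
}

func main() {
    // Hypothetical yearly TAVG averages for one station.
    avgByYear := map[int]float64{1961: 9.8, 1975: 10.1, 1990: 10.4, 2005: 11.0, 2020: 11.6}
    years := []int{1961, 1975, 1990, 2005, 2020}

    p := plot.New()
    p.Title.Text = "Temperature anomalies"
    p.X.Label.Text = "Year"
    p.Y.Label.Text = "Anomaly, °C"

    // AddLinePoints draws both a line and point glyphs, which is why the
    // article's chart ended up with the markers the author could not remove.
    if err := plotutil.AddLinePoints(p, "TAVG anomaly", anomalies(years, avgByYear, 1961, 1990)); err != nil {
        log.Fatal(err)
    }
    if err := p.Save(6*vg.Inch, 4*vg.Inch, "anomalies.png"); err != nil {
        log.Fatal(err)
    }
}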
Climate Change Stripes. This graph consists of colored stripes representing average temperature data for each year. Each stripe corresponds to a specific year, and the color intensity shows how warm or cool that year was. Warmer years are depicted with warmer colors like red, while cooler years are shown with cooler colors like blue. I started with a question to ChatGPT about whether it is possible to do this at all. Climate Change Stripes looks like a colored bar chart to me, so I asked whether it could be created, and the answer was no. By this point, I had become quite frustrated with Go's limited graphical capabilities and the extensive code needed to produce a basic graph. Consequently, I decided to give up on trying to create this particular graph in Go. While you can find the code responsible for calculating the graph's data in my repository, I wasn't able to successfully generate the graph using Go's capabilities.

Chapter, where the main character calls Python for help. To put it simply, I must admit that the graphs I created in Go didn't turn out well. It seems that Go is not the best tool for making nice-looking visualizations, so let's look for another superhero that can handle visualization tasks more effectively and beautifully. All the code mentioned below, and more, can be found in this Jupyter file in my repo. For this chapter, we need to install Python and a few libraries:

pip install pandas
pip install seaborn
pip install matplotlib
pip install numpy

First, we read the CSV into a pandas dataframe. Let's revisit the anomaly graph, because I feel I didn't prove that this graph shows any trend. In the code below, I replicated the same process of calculating the baseline by averaging the TAVG values for the baseline period and then calculating the temperature anomalies by subtracting the baseline value from each TAVG value in the data slice. Additionally, I included two lines to visually represent the trend in anomalies, allowing us to better observe any temperature changes over the years. Now, with the yellow and green lines, we can clearly observe the warming trend in the temperature anomalies. These lines help us better visualize changes in temperature over the years compared to the baseline period. Well, it's time to conquer the Climate Change Stripes graph. In this code, the data is filtered to focus on the ISM weather station. Subsequently, I grouped the data by year and created a list for each year containing the TAVG values. By calculating the annual mean temperatures and normalizing them to a common range, I mapped the mean temperatures to colors. The code produces the Climate Change Stripes effect by drawing stripes for each year, where the color of each stripe represents the normalized mean temperature for that particular year. As a result, we finally witness a clear warming trend presented in a visually striking and beautiful chart: it shows a significant increase in the red coloration in the later years, emphasizing the pronounced warming trend. Even with simple Python modules like Matplotlib, the job is accomplished exceptionally well. If we use a more sophisticated library for data visualization, such as Plotly, we can achieve even more advanced and interactive visualizations. Let's see an example for our first scatter plot graph. First, we should install the Plotly library:

pip install plotly

With the following code, we can create a professional-looking scatter plot using plotly.express.scatter. This scatter plot allows us to interact with the data by hovering the cursor over a point and getting its corresponding values of TMAX and TMIN. Actually, I even found a repo with wrapper functions for Plotly's HTTP API in Go; it might be helpful in the future.

Chapter, where the main character sees light at the end of the tunnel. Taking into account my difficulties with using Go for graph visualization, I am considering the following structure for my Proof of Concept (POC): (1) a robust database to store historical weather data from the CDO (Climate Data Online) source; (2) an intuitive frontend application that can construct climate graphs using the weather data; (3) an API application that will serve data from the database to the frontend app upon request; (4) a data transformation and aggregation app (ETL) that will fetch data from the CDO API, process and organize it, and finally load it into the database for easy access and retrieval by the frontend app. For the POC, I will set up a local environment using Docker containers. We'll start with a container hosting Jupyter, which facilitates the creation of climate graphs using Python and Plotly; this will serve as our frontend application. On the backend, we'll develop an API application in Go (another Docker container) that will connect to our chosen database. The API will be the bridge between the frontend and the database, serving the weather data upon request and ensuring smooth data flow for the graph generation.
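The POC's API is not implemented yet, so purely as an illustration of the "bridge" role described above, a minimal Go endpoint could look like the sketch below; the route, query parameter, and canned data are hypothetical stand-ins, not part of the article.

package main

import (
    "encoding/json"
    "log"
    "net/http"
)

// DailyTemperature is a hypothetical row shape the API could return to the frontend.
type DailyTemperature struct {
    Station string  `json:"station"`
    Date    string  `json:"date"`
    Tavg    float64 `json:"tavg"`
}

func main() {
    // In the real POC this handler would query the database; here it serves canned data.
    http.HandleFunc("/api/temperatures", func(w http.ResponseWriter, r *http.Request) {
        station := r.URL.Query().Get("station")
        if station == "" {
            http.Error(w, "missing station parameter", http.StatusBadRequest)
            return
        }
        sample := []DailyTemperature{
            {Station: station, Date: "2023-08-01", Tavg: 27.4},
            {Station: station, Date: "2023-08-02", Tavg: 26.9},
        }
        w.Header().Set("Content-Type", "application/json")
        if err := json.NewEncoder(w).Encode(sample); err != nil {
            log.Println("encode:", err)
        }
    })

    log.Println("weather API listening on :8080")
    log.Fatal(http.ListenAndServe(":8080", nil))
}

A Jupyter or React frontend would then fetch, for example, /api/temperatures?station=ISM and build the charts client-side.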
The final piece of the puzzle is an essential component: the data transformation and aggregation app (ETL). Implemented in Go, this container will fetch data from the Climate Data Online (CDO) API, process and organize it, and efficiently load it into the database. This ETL process will keep our database up to date with the latest weather data, ensuring our visualizations are always current. I'm still deciding on the database, but I'm considering using PostgreSQL. However, I'm open to suggestions and would love to hear your thoughts in the comments. Thus, this setup not only makes our system easy to run but also makes it portable: it can be run on any machine with Docker installed, without needing to worry about installing dependencies. Additionally, this architecture is flexible and future-proof. If we wanted to replace our Jupyter notebook with a different frontend, like a React app, we could do so without affecting the rest of the system. Similarly, if we wanted to switch to a different DBMS, we could replace our container with a different database container, update our Go applications to interact with the new database, and leave the rest of the system untouched. Despite the challenges, I am generally satisfied with the progress I have made on this journey. Conclusion. To sum up, in this chapter we've faced challenges and found solutions in visualizing climate data. In addition, we have outlined the key aspects of our Proof of Concept (POC). Stay tuned for further progress and new insights as we continue our adventure with Golang. Previous articles: "The Golang Saga: A Coder's Journey There and Back Again. Part: Leaving the Shire" and "The Golang Saga: A Coder's Journey There and Back Again. Part: The Data Expedition". 2023-08-16 17:06:20
海外TECH DEV Community Null and Undefined in JavaScript https://dev.to/islot/null-and-undefined-in-javascript-pof Null and Undefined in JavaScript.

Introduction. JavaScript has emerged as one of the leading programming languages, particularly in the realm of software engineering and web development. Despite its straightforward syntax and simplicity, beginners often encounter challenges when grasping certain concepts within the language. In this article, we will delve into a fundamental aspect of JavaScript's data types: Null and Undefined. By the end of this discussion, readers will gain a comprehensive understanding of Null and Undefined, as well as insight into their differences. Stay engaged to uncover the nuances of this intriguing topic.

Understanding Null and Undefined. To embark on our exploration, it's crucial to recognize that Null and Undefined are integral components of JavaScript's data types. In order to fully immerse ourselves in this subject matter, let us first establish clear definitions for these two key terms.

Null and Undefined: Definitions. Null: null is a distinct value that signifies the deliberate absence of any object value. It is frequently employed to indicate that a variable or property currently lacks a value or points to nothing. When a variable is assigned the null value, it indicates a lack of reference to any object or value within memory.

let foo = null;
console.log(foo); // Output: null

Undefined: undefined is a primitive value that denotes the absence of a defined value for a given variable. It is typically employed when a variable has been declared but remains unassigned, or when a function fails to explicitly return any value.

let user;
console.log(user); // Output: undefined

Upon examining these definitions, it becomes apparent that while foo is assigned the value null, the user variable lacks any assigned value. This initial observation foreshadows the existence of disparities between these two data types. Before delving into those distinctions, let's first explore the shared features that unite Null and Undefined.

Similarities and Differences. Similarities: both null and undefined in JavaScript serve as indicators of absent or meaningless values. Several commonalities underscore the nature of these data types. No value: whether null or undefined, both data types signify the absence of a valid value assigned to a variable. Falsy values: both null and undefined evaluate to false in boolean contexts, aligning with their role as falsy values. Primitive data types: null and undefined are both classified as primitive data types within JavaScript.

Distinctions between Null and Undefined. However, nuanced differences exist between these two data types, highlighting their unique characteristics and use cases.

Type assignment: while null is explicitly designated by a programmer to convey the absence of value, undefined often points to variables lacking assigned values.

let variable1;        // variable1 defaults to undefined
let variable2 = null; // variable2 is explicitly assigned null

Assignment behavior: undefined serves as the default value for uninitialized variables, whereas null is intentionally assigned to represent the absence of value. Both are falsy, so either one fails a truthiness check:

if (!variable1) { console.log("variable1 is falsy"); } // Output: variable1 is falsy
if (!variable2) { console.log("variable2 is falsy"); } // Output: variable2 is falsy

Usage: undefined is commonly employed to denote uninitialized variables or unassigned properties. In contrast, null is often utilized to express a deliberate absence of value.

function getValue() {
  // Note: no return statement
}
const result = getValue();
console.log(result); // Output: undefined

const intentionalAbsence = null;
console.log(intentionalAbsence); // Output: null
Behavior: when assessed for equality, null and undefined are loosely equal but not strictly equal, signifying the same "absence" meaning but distinct types.

console.log(null == undefined);  // Output: true
console.log(null === undefined); // Output: false

typeof operator: surprisingly, the typeof operator yields "undefined" for an undefined variable, while it produces "object" for a variable assigned the value null. This idiosyncrasy is one of JavaScript's peculiarities, deviating from the expectation that both should return the same value.

// Define an undefined variable
let undefinedVar;
// Define a variable with a null value
let nullVar = null;
// Use the typeof operator on the undefined variable
let result1 = typeof undefinedVar;
// Use the typeof operator on the null variable
let result2 = typeof nullVar;
// Output the results
console.log("Result for undefinedVar:", result1); // Should print "undefined"
console.log("Result for nullVar:", result2);      // Should print "object"

Arithmetic operations: in the realm of arithmetic operations, undefined produces NaN (Not a Number), whereas null is converted to 0 before participating in the operation.

// Perform arithmetic operations with the undefined variable
let result3 = undefinedVar + 1; // Results in NaN
let result4 = undefinedVar * 2; // Results in NaN
// Perform arithmetic operations with the null variable
let result5 = nullVar + 1; // Converts null to 0, result is 1
let result6 = nullVar * 2; // Converts null to 0, result is 0
// Output the results
console.log("Result 3:", result3); // Should print NaN
console.log("Result 4:", result4); // Should print NaN
console.log("Result 5:", result5); // Should print 1
console.log("Result 6:", result6); // Should print 0

Conclusion. In conclusion, an in-depth comprehension of the Null and Undefined data types in JavaScript contributes to the creation of intelligible and error-free code. Throughout this article, we have spotlighted their similarities, unveiled their distinctions, and explored real-world applications for these vital data types. To solidify your understanding, engage in practical exercises that involve working 2023-08-16 17:03:40
海外TECH Engadget New York City bans TikTok for government employees https://www.engadget.com/new-york-city-bans-tiktok-for-government-employees-174806575.html?src=rss New York City bans TikTok for government employees. New York City will ban TikTok from government devices, The Verge reported on Wednesday. City agencies have a set number of days to remove the ByteDance-owned app from their devices, and employees will not be allowed to download or use TikTok on their city-sanctioned tech, effective immediately. This comes three years after New York state banned TikTok from government devices, according to Times Union. NYC Cyber Command, a subset of the Office of Technology and Innovation, spurred the decision after reporting to the city that TikTok posed a security threat. Other states and localities, notably Montana, have made waves banning TikTok more generally across the jurisdiction. But on a wider scale, most legislators have taken the approach of banning the app for government employees, including the federal government. Thirty-three states across party lines now have restrictions on the use of TikTok on government-owned tech. As legislation continues to resurface considering a total ban on TikTok and other apps affiliated with the Chinese government, ByteDance fights to prove that it's not a threat to national security. TikTok CEO Shou Chew even testified in front of Congress, reiterating that "ByteDance is not an agent of China". The NYC Office of Technology and Innovation did not respond to a request for comment by the time of publication. This article originally appeared on Engadget. 2023-08-16 17:48:06
海外TECH Engadget YouTube's NFL Sunday Ticket includes live chat and highlights in Shorts https://www.engadget.com/youtubes-nfl-sunday-ticket-includes-live-chat-and-highlights-in-shorts-170037161.html?src=rss YouTube's NFL Sunday Ticket includes live chat and highlights in Shorts. Who's ready for some bad opinions from the internet while you watch football? YouTube has revealed some more NFL Sunday Ticket features for the upcoming season. As you watch games, you'll be able to view a live chat and read what other people think about a certain play or call. Live chat and polls will be available on both mobile and TVs. YouTube users will be able to watch real-time NFL highlights on Shorts. On Sunday afternoons, these highlights will include a red Live ring around the channel's avatar, and clicking on it will take users to the NFL channel's Live tab. There, Sunday Ticket subscribers can decide which game, or games thanks to the multiview options, to start watching. One other thing that could be helpful for viewers is key plays, a handy YouTube TV feature that the platform is bringing over to Sunday Ticket. You'll be able to catch up on a game that you couldn't watch or check out big plays before joining the live action. This feature will only be available on TVs this season, which is the first under a multibillion-dollar, seven-year pact that YouTube has with the NFL for Sunday Ticket rights. Naturally, YouTube is looking to recoup its investment in Sunday Ticket, and it's now offering fans more ways to sign up. Starting today, there will be a monthly payment plan option for Sunday Ticket in most states to help fans spread the cost of a subscription over a longer period. It may take a few days before the option is available on YouTube and YouTube TV in your area. However, the monthly plan won't be available to folks in Georgia, New York, Minnesota, Nevada, Missouri, Tennessee, or New Jersey. Residents of those states will need to pay for a season-long Sunday Ticket subscription up front. Meanwhile, YouTube says student plans will be available sometime in the next week. This article originally appeared on Engadget. 2023-08-16 17:00:37
海外科学 NYT > Science Pig Kidneys Performing Effectively in Two Brain-Dead Patients https://www.nytimes.com/2023/08/16/health/pig-kidney-organ-transplants.html Pig Kidneys Performing Effectively in Two Brain-Dead Patients. In two experiments, researchers implanted the organs into brain-dead patients for extended periods, raising hopes for a new supply of donor organs. 2023-08-16 17:06:41
ニュース BBC News - Home No plans for bank holiday if England win World Cup https://www.bbc.co.uk/news/uk-66524191?at_medium=RSS&at_campaign=KARANGA lionesses 2023-08-16 17:47:53
ニュース BBC News - Home British Museum worker sacked over missing items https://www.bbc.co.uk/news/uk-england-66527422?at_medium=RSS&at_campaign=KARANGA dismisses 2023-08-16 17:41:08
ニュース BBC News - Home PSNI data breach: Man arrested in investigation into linked criminality https://www.bbc.co.uk/news/uk-northern-ireland-66527832?at_medium=RSS&at_campaign=KARANGA criminality 2023-08-16 17:49:13
ニュース BBC News - Home Andrew Malkinson: Calls for inquiry into wrongful rape conviction https://www.bbc.co.uk/news/uk-66524197?at_medium=RSS&at_campaign=KARANGA andrew 2023-08-16 17:37:25
ニュース BBC News - Home Bradley Cooper: Leonard Bernstein's family defend actor over Maestro nose row https://www.bbc.co.uk/news/entertainment-arts-66526446?at_medium=RSS&at_campaign=KARANGA jewish 2023-08-16 17:08:59
ニュース BBC News - Home Maui fire: First victims named as death toll reaches 106 https://www.bbc.co.uk/news/world-us-canada-66518502?at_medium=RSS&at_campaign=KARANGA buddy 2023-08-16 17:04:06
ビジネス ダイヤモンド・オンライン - 新着記事 米旅行者に「メキシコ疲れ」 コロナ禍で人気も - WSJ発 https://diamond.jp/articles/-/327747 疲れ 2023-08-17 02:20:00
Azure Azure の更新情報 Public Preview: Azure Elastic SAN Updates: Private endpoints & shared volumes https://azure.microsoft.com/ja-jp/updates/azure-elastic-san-updates-private-endpoints-shared-volumes/ Public Preview: Azure Elastic SAN Updates: Private endpoints & shared volumes. Introducing the latest updates to Azure Elastic SAN in preview: private endpoint support and volume sharing support via SCSI (Small Computer System Interface) Persistent Reservation. 2023-08-16 18:00:07
