IT |
気になる、記になる… |
Amazon launches various Kodansha "夏☆電書2021" (Summer e-Book 2021) sales on the Kindle store |
https://taisy0.com/2021/06/25/142286.html
|
amazon |
2021-06-24 16:35:44 |
TECH |
Engadget Japanese |
Windows 11 runs Android apps in a window, in partnership with the Amazon Appstore |
https://japanese.engadget.com/windows11-android-apps-ms-store-amazon-164516678.html
|
amazon |
2021-06-24 16:45:16 |
AWS |
AWS Big Data Blog |
Increase Amazon Elasticsearch Service performance by upgrading to Graviton2 |
https://aws.amazon.com/blogs/big-data/increase-amazon-elasticsearch-service-performance-by-upgrading-to-graviton2/
|
Increase Amazon Elasticsearch Service performance by upgrading to Graviton2. Amazon Elasticsearch Service (Amazon ES) supports multiple instance types based on your use case. AWS announced general purpose (M6g), compute optimized (C6g), and memory optimized (R6g, R6gd) instance types for Amazon ES, powered by AWS Graviton2 processors, which deliver a major leap in capabilities and a better price-performance improvement over … (a minimal upgrade sketch follows this entry) |
2021-06-24 16:45:05 |
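A minimal sketch of what such an upgrade could look like in code, assuming boto3 credentials with access to the domain; the domain name and node count below are hypothetical placeholders, and the blog post's exact steps may differ.

```python
# Hypothetical sketch: switch an existing Amazon ES domain to Graviton2 (M6g) data nodes.
# Assumes AWS credentials are configured and the domain runs a Graviton2-compatible ES version.
import boto3

es = boto3.client("es")  # Amazon Elasticsearch Service client

response = es.update_elasticsearch_domain_config(
    DomainName="my-sample-domain",  # hypothetical domain name
    ElasticsearchClusterConfig={
        "InstanceType": "m6g.large.elasticsearch",  # Graviton2 general-purpose data nodes
        "InstanceCount": 2,                         # hypothetical node count
    },
)

# The call returns the updated (possibly still-processing) cluster configuration.
print(response["DomainConfig"]["ElasticsearchClusterConfig"]["Options"]["InstanceType"])
```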
AWS |
AWS |
Amazon Aurora Global Database - Managed Planned Failover for Cross Region DR and Data Locality |
https://www.youtube.com/watch?v=yaOnBLmnTcI
|
Amazon Aurora Global Database: Managed Planned Failover for Cross-Region DR and Data Locality. Learn how to use Amazon Aurora Global Database for cross-Region disaster recovery and improve read performance of globally available applications with a click of a button. Learn more about Amazon Aurora. Subscribe for more AWS videos and AWS events videos. #AWS #AWSDemos #AmazonAurora (an API-level failover sketch follows this entry) |
2021-06-24 16:48:44 |
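For readers who want the same "click of a button" failover from code, here is a hedged boto3 sketch of an Aurora Global Database managed planned failover; the cluster identifiers and Regions are hypothetical placeholders.

```python
# Hypothetical sketch: trigger a managed planned failover of an Aurora global database,
# promoting a secondary-Region cluster to primary (planned failover, no data loss expected).
import boto3

rds = boto3.client("rds", region_name="us-east-1")  # hypothetical current primary Region

response = rds.failover_global_cluster(
    GlobalClusterIdentifier="my-global-cluster",  # hypothetical global cluster name
    TargetDbClusterIdentifier=(
        "arn:aws:rds:eu-west-1:123456789012:cluster:my-secondary"  # hypothetical secondary ARN
    ),
)

# The global cluster reports a failing-over style status while Aurora re-points the primary.
print(response["GlobalCluster"]["Status"])
```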
AWS |
AWS |
Introduction to AWS Service Catalog |
https://www.youtube.com/watch?v=A6-jv3gZa4U
|
Introduction to AWS Service Catalog. In this video you'll get an introduction to AWS Service Catalog. With this service you can create, organize, and govern a curated catalog of AWS products that can be shared by permission level, so end users can quickly provision approved IT resources without needing direct access to the underlying AWS services. For more information on this topic, please visit the resources below: Service Catalog, GitHub. Subscribe for more AWS videos and AWS events videos. #AWS #AWSDemos (a small provisioning sketch follows this entry) |
2021-06-24 16:45:40 |
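As a rough illustration of the end-user flow the video describes, provisioning an approved product without direct access to the underlying services, here is a hedged boto3 sketch; the product, artifact, and parameter values are hypothetical placeholders.

```python
# Hypothetical sketch: an end user provisions an approved product from AWS Service Catalog.
import boto3

sc = boto3.client("servicecatalog")

response = sc.provision_product(
    ProductId="prod-examplexxxxxxxx",               # hypothetical product ID
    ProvisioningArtifactId="pa-examplexxxxxxxx",    # hypothetical product version (artifact) ID
    ProvisionedProductName="team-dev-environment",  # name for this provisioned instance
    ProvisioningParameters=[
        {"Key": "InstanceType", "Value": "t3.micro"},  # hypothetical template parameter
    ],
)

# Track the launch via the returned record until its status reaches SUCCEEDED.
print(response["RecordDetail"]["RecordId"], response["RecordDetail"]["Status"])
```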
AWS |
AWS |
Enable AWS Control Tower for Existing Organizations |
https://www.youtube.com/watch?v=CwRy0t8nfgM
|
Enable AWS Control Tower for Existing Organizations. In this video you'll see how to enable AWS Control Tower for existing organizations. With AWS Control Tower you can govern and manage your organization's accounts, apply security guardrails across accounts, and monitor compliance for new and existing accounts. For more information on this topic, please visit these resources: Learn more about AWS Control Tower, the AWS Control Tower User Guide, and the Management & Governance Blog on AWS Control Tower. Subscribe for more AWS videos and AWS events videos. #AWS #AWSDemo |
2021-06-24 16:44:05 |
Program |
List of new questions for [all tags] - teratail |
I want to implement a filtered search feature with Express + MySQL |
https://teratail.com/questions/345956?rss=all
|
After investigating, I thought the appropriate approach would be to POST the input and display the results in result.ejs, so I tried writing the EJS view and the router. |
2021-06-25 01:49:13 |
Program |
List of new questions for [all tags] - teratail |
Random number generation in Python |
https://teratail.com/questions/345955?rss=all
|
|
2021-06-25 01:49:10 |
Program |
List of new questions for [all tags] - teratail |
Queue: I get an error |
https://teratail.com/questions/345954?rss=all
|
Queue: I get an error. I wrote a program like the one below, but when I run it, I get an error along the lines of: Exception in thread "main" java.lang.ArrayIndexOutOfBoundsException: Index … out of bounds for length …. (An illustrative ring-buffer sketch follows this entry.) |
2021-06-25 01:35:49 |
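The question's code is Java and is not included in this feed excerpt; purely as an illustration of the usual cause, an array-backed queue that walks its head or tail index off the end of the array, here is a hedged Python sketch of the index-wrapping fix.

```python
# Illustrative sketch (the question itself is Java): an array-backed queue typically raises an
# index-out-of-bounds error when head/tail indices run past the array instead of wrapping.
# Wrapping with the modulo operator keeps every access inside the backing array.
class RingQueue:
    def __init__(self, capacity: int):
        self.items = [None] * capacity
        self.capacity = capacity
        self.head = 0   # index of the oldest element
        self.size = 0   # number of stored elements

    def enqueue(self, value):
        if self.size == self.capacity:
            raise OverflowError("queue is full")
        tail = (self.head + self.size) % self.capacity  # wrap instead of overrunning the array
        self.items[tail] = value
        self.size += 1

    def dequeue(self):
        if self.size == 0:
            raise IndexError("queue is empty")
        value = self.items[self.head]
        self.head = (self.head + 1) % self.capacity     # wrap the head index as well
        self.size -= 1
        return value

q = RingQueue(3)
q.enqueue("a"); q.enqueue("b"); q.enqueue("c")
print(q.dequeue(), q.dequeue())  # a b
```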
Program |
List of new questions for [all tags] - teratail |
Tab switching: I want to activate the tab menu from a menu button |
https://teratail.com/questions/345953?rss=all
|
Tab switching: I want to activate the tab menu from a menu button. Background / what I want to achieve: I am building a tab-switching page with jQuery. |
2021-06-25 01:32:58 |
Program |
List of new questions for [all tags] - teratail |
Jump(); // Jump |
https://teratail.com/questions/345952?rss=all
|
Jump(); // Jump. Background / what I want to achieve: I am a Unity beginner. |
2021-06-25 01:18:40 |
Program |
List of new questions for [all tags] - teratail |
About "unsupported operand type(s) for +: 'int' and 'NoneType'" |
https://teratail.com/questions/345951?rss=all
|
About "unsupported operand type(s) for +: 'int' and 'NoneType'". Background / what I want to achieve: please write the details of your question here. (A small reproduction of this error follows this entry.) |
2021-06-25 01:17:43 |
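The error in the title is a common Python one: an expression adds an int to a value that is None, often because a function with no return statement is used in arithmetic. Here is a minimal hedged reproduction (not the asker's code, which is not included in this excerpt).

```python
# Reproducing "unsupported operand type(s) for +: 'int' and 'NoneType'":
# a function that only prints, and has no return statement, returns None.
def add_broken(a, b):
    print(a + b)            # prints the sum but returns None

# total = 1 + add_broken(2, 3)   # TypeError: unsupported operand type(s) for +: 'int' and 'NoneType'

def add_fixed(a, b):
    return a + b            # return the value so it can be used in further arithmetic

total = 1 + add_fixed(2, 3)
print(total)                # 6
```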
Program |
List of new questions for [all tags] - teratail |
I don't understand the conditional logic when adding to a list, or how to add the items afterwards |
https://teratail.com/questions/345950?rss=all
|
I don't understand the conditional logic when adding to a list, or how to add the items afterwards. Background / what I want to achieve: I want to simplify the processing above the dotted line by using a list. |
2021-06-25 01:16:25 |
Program |
List of new questions for [all tags] - teratail |
An error occurs when building MediaPipe with Bazel in Android Studio |
https://teratail.com/questions/345949?rss=all
|
An error occurs when building MediaPipe with Bazel in Android Studio. |
2021-06-25 01:04:28 |
Program |
List of new questions for [all tags] - teratail |
I want to resolve the cause of "No route matches [GET]" |
https://teratail.com/questions/345948?rss=all
|
I want to resolve the cause of "No route matches [GET]". |
2021-06-25 01:00:58 |
Linux |
New posts tagged Ubuntu - Qiita |
How to integrate VS Code with Git |
https://qiita.com/nutsOst/items/82706a7160305978bac3
|
|
2021-06-25 01:16:11 |
Git |
New posts tagged Git - Qiita |
How to integrate VS Code with Git |
https://qiita.com/nutsOst/items/82706a7160305978bac3
|
|
2021-06-25 01:16:11 |
Ruby |
New posts tagged Rails - Qiita |
How to connect on a port other than localhost [Rails] Blocked host |
https://qiita.com/Ub_Iwerks/items/35ec63c021401269bf98
|
This is a feature that, in order to prevent DNS rebinding attacks, rejects access from domain names that have not been allowed. |
2021-06-25 01:58:18 |
Overseas TECH |
DEV Community |
Instrumenting Your Node.js Apps with OpenTelemetry |
https://dev.to/newrelic/instrumenting-your-node-js-apps-with-opentelemetry-5flb
|
Instrumenting Your Node.js Apps with OpenTelemetry. As systems become increasingly complex, it's increasingly important to get visibility into the inner workings of systems to increase performance and reliability. Distributed tracing shows how each request passes through the application, giving developers context to resolve incidents and showing what parts of their system are slow or broken. A single trace shows the path a request makes from the browser or mobile device down to the database. By looking at traces as a whole, developers can quickly discover which parts of their application have the biggest impact on performance as it affects your users' experiences. That's pretty abstract, right? So let's zero in on a specific example to help clarify things: we'll use OpenTelemetry to generate and view traces from a small sample application.
Spinning up our Movies App: We have written a simple application consisting of two microservices, movies and dashboard. The movies service provides the names of movies and their genre in JSON format, while the dashboard service returns the results from the movies service. Clone the repo, then to spin up the app run "npm i", "node dashboard.js", and "node movies.js". Notice the variable delay built into the movies microservice: an Express GET /movies handler that waits a random number of milliseconds before sending a hard-coded JSON list of movies (Jaws: Thriller, Annie: Family, Jurassic Park: Action).
Tracing HTTP requests with OpenTelemetry: OpenTelemetry traces incoming and outgoing HTTP requests by attaching IDs. To do this we need to instantiate a trace provider to get data flowing, configure that trace provider with an exporter to send telemetry data to another system where you can view, store, and analyze it, and install OpenTelemetry plugins to instrument specific node modules and automatically instrument various frameworks.
Step 1, create our trace provider and configure it with an exporter: install the OpenTelemetry auto-instrumentation package for Node.js ("npm install @opentelemetry/node"). It provides auto-instrumentation for Node.js applications, automatically identifying frameworks (Express), common protocols (HTTP), databases, and other libraries within your application, and it uses other community-contributed plugins to produce spans and provide end-to-end tracing with just a few lines of code. OpenTelemetry plugins: "npm install @opentelemetry/plugin-http" and "npm install @opentelemetry/plugin-express". The HTTP plugin generates trace data from Node.js's underlying HTTP handling APIs, for both sent and handled requests; the Express plugin generates trace data from requests sent through the Express framework.
Step 2, add the trace provider and the span processor: the snippet requires NodeTracerProvider from @opentelemetry/node and ConsoleSpanExporter and SimpleSpanProcessor from @opentelemetry/tracing, creates a provider, wraps a ConsoleSpanExporter in a SimpleSpanProcessor, adds it with provider.addSpanProcessor, and calls provider.register(). This gets data out of your local application and exports it into your console; once we add this code snippet, whenever we reload the dashboard URL on localhost we should get beautiful things on the console.
Step 3a, spinning up Zipkin: start a Zipkin instance from the Docker Hub image ("docker run -d -p … openzipkin/zipkin") and you'll have Zipkin up and running; load it by pointing your web browser at localhost.
Step 3, exporting to Zipkin: while it's neat, spans in a terminal window are a poor way to have visibility into a service. The lines above added a console exporter to our system; let's now ship this data to Zipkin. We instantiate a ZipkinExporter (from @opentelemetry/exporter-zipkin) with the Zipkin spans URL and a serviceName of movies-service, wrap it in another SimpleSpanProcessor, and add it to the trace provider. The great thing about OpenTelemetry is that it's backend agnostic, meaning you can have as many different exporters configured as you like. After you make these changes, visit the Zipkin instance on localhost, start the application back up, and request some URLs.
Step 4, using the OpenTelemetry Collector to export the data into New Relic: what happens if we want to send the OpenTelemetry data to another backend, where you don't have to manage all of your own telemetry data? The amazing contributors to OpenTelemetry have come up with a solution: the OpenTelemetry Collector is a way for developers to receive, process, and export telemetry data to multiple backends. It supports multiple open source observability data formats like Zipkin, Jaeger, Prometheus, and Fluent Bit, sending them to one or more open source or commercial backends. New Relic is a platform for you to analyze, store, and use your telemetry data for free, forever; sign up now. To configure the OpenTelemetry Collector with New Relic, clone the OpenTelemetry Collector with New Relic Exporter and spin up the Docker container, making sure to export the New Relic API key ("export NEW_RELIC_API_KEY=<INSERT API KEY HERE>", then "docker-compose -f docker-compose.yaml up"). Make sure to change the reporting URL from the Zipkin endpoint to the collector endpoint in both dashboard.js and movies.js.
Step 5, look at your beautiful data: navigate to the Explorer tab on New Relic One; when you click on the service, you should be able to see some beautiful traces.
Final thoughts: Instrumenting your app with OpenTelemetry makes it easy to figure out what is going wrong when parts of your application are slow, broken, or both. With the collector you can forward your data anywhere, so you are never locked into a vendor: you can spin up an open source backend, use a proprietary backend like New Relic, or roll your own. Whatever you choose, I wish you well in your journey to instrument EVERYTHING. (An equivalent sketch of the provider, processor, and exporter setup in Python follows this entry.) |
2021-06-24 16:21:26 |
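The article's snippets are Node.js. Purely as a language-neutral illustration of the same provider, span processor, and exporter pattern, here is a hedged sketch with the OpenTelemetry Python SDK (assuming opentelemetry-api and opentelemetry-sdk 1.x are installed); it is not the article's code, and the span names and attributes are hypothetical.

```python
# Same tracing pattern as the article, sketched with the OpenTelemetry Python SDK:
# a tracer provider, a span processor, and a console exporter that prints finished spans.
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor

provider = TracerProvider()
provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)

tracer = trace.get_tracer("movies-demo")  # hypothetical instrumentation name

# Each request handler would open a span like this; nested calls become child spans.
with tracer.start_as_current_span("GET /movies") as span:
    span.set_attribute("movies.count", 3)  # example attribute recorded on the span

# On exit the span is exported, here to stdout; swapping in a Zipkin or OTLP exporter
# only changes the exporter passed to the span processor.
```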
Overseas TECH |
DEV Community |
Image upload using Golang and React |
https://dev.to/harshmangalam/image-upload-using-golang-and-react-29n1
|
Image upload using Golang and React. Golang is a blockbuster server-side language in the field of efficiency and concurrency. If you are a Node.js developer you will definitely have come across Express.js for building your web API services; Gofiber is exactly like the Express framework for Golang, and no doubt it booms with the efficiency of Fasthttp and Golang. In this blog post we will create a simple image upload server using gofiber, and we will use React.js on the frontend to select an image from a file input and upload it to the server. We will use axios for HTTP requests to the server; it is really awesome when we deal with implementing authentication and handling lots of API requests, and it has lots of features which make life easy when dealing with APIs in React. We will use Chakra UI for designing material-like buttons, images, and layout; it shines in accessibility, which directly affects SEO. Libraries and tools we will use: golang, gofiber, reactjs, axios, chakra-ui.
Setup backend: create a new directory and enter it ("mkdir go-react-image-upload", "cd go-react-image-upload"), then create a server directory inside it and enter it ("mkdir server", "cd server"). Set up the Go environment ("go mod init github.com/harshmangalam") and install the packages required for the backend ("go get github.com/gofiber/fiber/v…" and "go get github.com/google/uuid"); uuid will help to generate a unique ID so that we can name our image easily and no two images will have the same name. Create a new Go file main.go inside server and start writing code: the program creates a new Fiber instance used across the whole app, adds the CORS middleware to allow all clients, serves images from the images directory under the /images prefix, registers a POST route for handleFileupload, registers a DELETE /:imageName route for handleDeleteImage, and starts the dev server. handleFileupload parses the incoming file from the "image" form field, generates a new UUID for the image name, extracts the image extension from the original filename, saves the file into the images directory, builds the image URL to serve to the client, and returns JSON metadata (imageName, imageUrl, header, size) with an "Image uploaded successfully" message, or a server error JSON on failure. handleDeleteImage extracts the image name from the params, removes the file from the images directory, and returns an "Image deleted successfully" or server error JSON. Run it with "go run main.go"; now our server is up and running and we can test it using Postman.
Setup frontend: come back out of the server directory and generate a React.js project using create-react-app ("npx create-react-app reactjs", "cd reactjs"), then install dependencies ("npm i @chakra-ui/react @emotion/react @emotion/styled framer-motion axios"). index.js renders the App component. App.js wraps everything in ChakraProvider, sets the axios default baseURL to the server address, and centers an Upload component inside a Chakra Box and Container. A new hook, hooks/useUpload.js, keeps image, loading, and uploadedImage state, shows Chakra toasts, and exposes handleChangeImage (stores the selected file), handleUploadImage (appends the file to a FormData under the "image" field, POSTs it with axios, and stores the returned metadata), and handleRemoveImage (sends an axios DELETE for the uploaded image name and clears the state). components/Upload.js uses a hidden file input triggered by a "Select Image" button, previews the selected image with URL.createObjectURL, shows an Upload button while a file is selected, and once uploaded shows the served image with its size in a Tag and a Delete button. (A small Python client sketch for exercising these endpoints follows this entry.) |
2021-06-24 16:19:29 |
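The post's code is Go and JavaScript. As a quick way to exercise the described endpoints outside Postman, here is a hedged Python sketch using the requests library; the port, file name, and exact routes are assumptions based on the post's description, not values taken from it.

```python
# Hypothetical client for the upload API described above (server address and routes assumed).
import requests

BASE_URL = "http://localhost:4000"   # hypothetical port; use whatever the Fiber app listens on

# Upload: multipart form with the file under the "image" field, as the Go handler expects.
with open("cat.webp", "rb") as f:
    upload = requests.post(BASE_URL + "/", files={"image": f})
meta = upload.json()["data"]
print(meta["imageUrl"], meta["size"])

# Delete: the unique file name returned by the upload becomes part of the DELETE path.
deleted = requests.delete(f"{BASE_URL}/{meta['imageName']}")
print(deleted.json()["message"])
```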
Overseas TECH |
DEV Community |
Introducing the Core Web Vitals Technology Report |
https://dev.to/httparchive/introducing-the-core-web-vitals-technology-report-4pep
|
Introducing the Core Web Vitals Technology Report. The technologies you use to build your website can have an effect on your ability to deliver good user experiences. Good UX is key to performing well with Core Web Vitals (CWV), a topic which is probably top of mind for you, as it is for many other web developers, now that these metrics play a role in Google Search ranking. While web developers have had tools like Search Console and PageSpeed Insights to get data on how their sites are performing, the web community has been lacking a tool that operates at the macro level, giving us something more like "WebSpeed Insights". By combining the powers of real user experiences in the Chrome UX Report (CrUX) dataset with web technology detections in HTTP Archive, we can get a glimpse into how architectural decisions like choices of CMS platform or JavaScript framework play a role in sites' CWV performance. The merger of these datasets is a dashboard called the Core Web Vitals Technology Report. This dashboard was developed for the web community to have a shared source of truth for the way websites are both built and experienced. For example, the CWV Technology Report can tell you what percentage of websites built with WordPress pass the CWV assessment. While a number like this on its own is interesting, what's more useful is the ability to track this over time and compare it to other CMSs, and that's exactly what the dashboard offers: it's an interactive way to view how websites perform, broken down by nearly … technologies. This post is a show and tell: first I'd like to walk you through the dashboard and show you how to use it, then I'll tell you more about the data methodology behind it.
Using the dashboard: There are three pages in the dashboard: Technology drilldown, Technology comparison, and Settings. The drilldown page lets you see how desktop and mobile experiences change over time for a single technology; the default metric is the percent of origins having good CWV, and it also supports individual CWV metrics (see the Optional metrics section below). The comparison page lets you compare desktop OR mobile experiences for any number of technologies over time; similar to the drilldown page, you can select overall CWV compliance or individual CWV metrics, and additionally this page supports visualizing the number of origins per technology. The settings page is where you can configure report-level preferences. There are currently two settings: categories and number of origins. Refer to Wappalyzer for the list of possible categories; use this setting to limit the related technologies in the dropdown list. You can also restrict the technologies to those with a minimum level of adoption, for example those used by at least … websites, which can be helpful to reduce noisiness. By default the CWV Technology Report is configured to drill down into WordPress performance and compare WordPress, Wix, and Squarespace; this is to demonstrate the kinds of insights that are possible out of the box, without having to know how to configure the dashboard yourself.
Optional metrics: You can also use the optional metrics feature of Data Studio to customize the dashboard and select specific CWV stats in the charts and tables as needed (the icon that looks like a chart with a gear is the button to select optional metrics). In the timeseries chart you can toggle between the percent of origins having good CWV overall or specifically those with good LCP, FID, or CLS; on the table views you can use this feature to add or remove columns, for example to see all CWV metrics separately or to focus on just one. Data Studio also enables you to share deep links into the dashboard for specific configurations. For example, here's a leaderboard of the most popular CMSs ordered by CWV performance as of May: a table with columns Date, Technology, Origins, and Percent good CWV, listing 1C-Bitrix, TYPO3 CMS, Drupal, Zendesk, Weebly, Squarespace, Joomla, Wix, Adobe Experience Manager, and WordPress (the numeric values did not survive this excerpt). Here are some other configurations to help you explore the data: a leaderboard of all technologies in the CMS or Blogs categories having more than … origins; a comparison of all JavaScript frameworks and libraries with jQuery in their name; a year-to-date drilldown into React CWV performance.
Feature roadmap: There are two features missing from the dashboard that I would love to add in the near future: segmenting by CrUX rank magnitude and comparing Lighthouse audit compliance. Origin popularity would be a really interesting way to slice the data, and the rank magnitude dimension would enable us to see how technology adoption and CWV performance change at the head, torso, and tail of the web. Adding data from Lighthouse would enable us to get some clues into why a particular technology may be better or worse with CWV; for example, if a group of websites tends to have poor LCP performance, it'd be interesting to see what loading performance audits they also tend to fail. Of course there are so many variables at play that we can't determine cause and effect, but these results could give us something to think about for further exploration.
Methodology: The CWV Technology Report is a combination of two data sources, CrUX and HTTP Archive. They are similar datasets in that they measure millions of websites, but they have their own strengths and weaknesses worth exploring. CrUX is a field tool, meaning that it measures real user experiences. It's also a public dataset, so you can see how users experience any one of millions of websites, which is really cool, to put it loosely, because we as a community have visibility into how the web as a whole is being experienced. The CrUX dataset is powered by Chrome users who enable usage statistics reporting; their experiences on publicly discoverable websites are aggregated together over 28-day windows, and the results are published in queryable monthly data dumps on BigQuery and via the CrUX API, updated daily. CrUX measures users' experiences for each of the CWV metrics: LCP, FID, and CLS. Using this data, we can evaluate whether a website passes the CWV assessment: whether 75% of experiences for each metric are at least as good as the thresholds set by the Web Vitals program. HTTP Archive is a lab tool, meaning that it measures how individual web pages are built. Like CrUX it's a public dataset, and it's actually based on the same websites in the CrUX corpus, so we have perfect parity when combining the two sources together. HTTP Archive is powered by WebPageTest, which integrates with other lab tools like Lighthouse and Wappalyzer to extract fine-grained data about the page. Lighthouse runs audits against the page to determine how well optimized it is, for example whether it takes advantage of web performance best practices. Wappalyzer is an open source tool that detects the use of technologies, like an entire CMS, a specific JavaScript library, and even what programming languages are probably used on the backend; these detections are what we use in the CWV Technology Report to segment the real user experience data from CrUX. Confession time: this isn't the first tool to look at CrUX data through the lens of how websites are built. Perf Track is a report built by Houssein Djirdeh that slices CrUX data by JavaScript frameworks, and the annual CMS chapter of the Web Almanac slices CrUX data by, you guessed it, CMSs. What makes the CWV Technology Dashboard different is that it facilitates exploration of the data by making all technologies across categories discoverable in a single browseable UI; you can choose your own adventure by filtering technologies to a single category like Ecommerce and comparing platforms head to head to see which has more websites passing the CWV assessment. The CrUX dataset on BigQuery is aggregated at the origin level. An origin is a way to identify an entire website; for example, https://httparchive.org is the origin for the HTTP Archive website, and it's different from https://almanac.httparchive.org, which is a separate origin for the Web Almanac website. HTTP Archive measures individual web pages, not entire websites, and due to capacity limitations HTTP Archive is limited to testing one page per website. The most natural page to test for a given website is its home page, the root page of the origin; for example, the root page of the HTTP Archive website is https://httparchive.org/ (note the trailing slash). This introduces an important assumption that we make in the CWV Technology Dashboard: an entire website's real user experiences are attributed to the technologies detected only on its home page. It's entirely possible that many websites we test use different technologies on their interior pages, and some technologies may even be more or less likely to be used on home pages. These biases are worth acknowledging in the methodology for full transparency, but to be honest there's not a lot we at HTTP Archive can do to mitigate them without becoming a full-blown web crawler.
Core Web Vitals: There may be different approaches to measure how well a website or group of websites performs with CWV; the approach used by this dashboard is designed to most closely match the CWV assessment in PageSpeed Insights. CWV metrics and thresholds may change annually, but we'll do our best to keep the dashboard in sync with the state of the art. Each individual CWV metric has a threshold below which user experiences are considered good; for example, LCP experiences under 2.5 seconds are good. A website must have at least 75% of its LCP experiences in the good category to be considered as having good LCP overall, and if all of the CWV metrics are good, the website is said to pass the CWV assessment. Refer to the official CWV documentation for the latest guidance on the set of metrics and thresholds. FID is an exception worth mentioning: because it relies on user input to be measured, it doesn't occur on as many page loads as metrics like LCP and CLS, which makes it less likely to have sufficient data for pages that may not have many interactive UI elements or for websites with low popularity. So the CWV Technology Dashboard replicates the behavior in PageSpeed Insights and assesses a website's CWV even in the absence of FID data; in that case, if LCP and CLS are good, the website passes, otherwise it doesn't. In the rare case that a website is missing LCP or CLS data, it's not eligible to be assessed at all. When evaluating a group of origins, like those in the dashboard that all use the same technology, we quantify them in terms of the percentage of origins that pass the CWV assessment. This is not to be confused with the percentage of users or the percentage of experiences; origins are aggregated in CrUX in a way that doesn't make it meaningful to combine their distributions together, so instead we count origins as a unit: those that use jQuery, pass the CWV assessment, have sufficient FID data, have good LCP, and so on. The CrUX dataset includes a form factor dimension representing the type of device the user was on; we segment all of the data in the dashboard by this dimension and call it the "Client", with values of either desktop or mobile.
Querying the raw data: The dashboard is implemented in Data Studio with a BigQuery connector to power all of the technology and CWV insights. The underlying table on BigQuery is made publicly available at httparchive.core_web_vitals.technologies; feel free to query this table directly to extract information about specific technology trends, or even to build your own custom dashboards or visualizations. (A small Python query sketch follows this entry.) For reference, the query that generated the core_web_vitals.technologies table (reproduced in the post, heavily garbled in this excerpt) defines helper functions IS_GOOD and IS_NON_ZERO over the good, needs-improvement, and poor counts, joins the chrome-ux-report materialized device_summary data with HTTP Archive technology detections by date, URL, and client, and aggregates per date, app (technology), and client the distinct origins, the origins with good or any FID, CLS, and LCP, the origins with good CWV, the origins eligible for CWV, and pct_eligible_origins_with_good_cwv.
The most idealistic goal for this dashboard is to empower influencers in the web community to make improvements to swaths of websites at scale. Web transparency projects like this one are meant to inform and inspire, whether that's instilling a sense of competitiveness with other related technologies to climb the leaderboard, or giving them actionable data to make meaningful improvements to technologies under their control. Please leave a comment if you have any suggestions to help make the CWV Technology Report better. |
2021-06-24 16:02:56 |
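Since the post points readers at the public httparchive.core_web_vitals.technologies table, here is a hedged sketch of querying it with the google-cloud-bigquery Python client; it assumes a GCP project with BigQuery access, and the column names are read off the query reproduced in the post.

```python
# Hypothetical sketch: pull the monthly share of WordPress origins passing CWV on mobile
# from the public table the post describes. Requires google-cloud-bigquery and GCP credentials.
from google.cloud import bigquery

client = bigquery.Client()  # uses your default project and credentials

query = """
    SELECT date, app, client, origins, pct_eligible_origins_with_good_cwv
    FROM `httparchive.core_web_vitals.technologies`
    WHERE app = 'WordPress' AND client = 'mobile'
    ORDER BY date
"""

for row in client.query(query).result():
    print(row.date, row.app, row.pct_eligible_origins_with_good_cwv)
```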
Apple |
AppleInsider - Frontpage News |
House Speaker Nancy Pelosi told Tim Cook to let antitrust bills play out |
https://appleinsider.com/articles/21/06/24/house-speaker-nancy-pelosi-told-tim-cook-to-let-antitrust-bills-play-out?utm_medium=rss
|
House Speaker Nancy Pelosi told Tim Cook to let antitrust bills play out. U.S. House Speaker Nancy Pelosi confirmed that she spoke with Apple CEO Tim Cook about a recent slate of antitrust bills, adding that she told him to let the process play out. (Credit: J. Scott Applewhite / AP) On Thursday the House Speaker detailed her conversation with Cook, but maintained her stance that American privacy and data are at the ends of giant technology companies. Pelosi said she told Cook to let the legislative process continue, CNBC reported. Read more |
2021-06-24 16:44:10 |
Apple |
AppleInsider - Frontpage News |
Microsoft Windows 11 revealed with dramatic increase in system requirements |
https://appleinsider.com/articles/21/06/24/microsoft-windows-11-revealed-with-dramatic-increase-in-system-requirements?utm_medium=rss
|
Microsoft Windows 11 revealed with dramatic increase in system requirements. Microsoft has officially debuted Windows 11 with an all-new user interface, the ability to run Android apps from the Amazon Appstore, and more. Windows 11 has been announced. Windows 11 has a focus on productivity, social, and entertainment features; the new design places the Start menu in the center of the dock and focuses on easy multitasking. Read more |
2021-06-24 16:46:41 |
Overseas TECH |
Engadget |
Instagram tests posting photos from your desktop |
https://www.engadget.com/instagram-tests-desktop-posting-162209909.html?src=rss_b2c
|
browsers |
2021-06-24 16:22:09 |
Overseas TECH |
CodeProject Latest Articles |
StructMapping - Mapping JSON to and from a C++ Structure |
https://www.codeproject.com/Articles/5270863/StructMapping-Mapping-JSON-to-and-from-a-Cplusplus
|
StructMapping - Mapping JSON to and from a C++ Structure. I would like to define a C++ structure, pass the person instance to the mapping method along with JSON data, then use the filled structure. Or, vice versa, get the Person as JSON. StructMapping is trying to solve these problems. (A rough Python analogue of the idea follows this entry.) |
2021-06-24 16:58:00 |
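StructMapping itself is a C++ library. Purely as an illustration of the structure-to-JSON idea it solves, here is a hedged Python analogue using only the standard library; this is not StructMapping's API, and the field names are hypothetical.

```python
# Analogue of the idea only: define a structure, fill it from JSON, and dump it back to JSON.
import json
from dataclasses import dataclass, asdict

@dataclass
class Person:
    name: str
    age: int

raw = '{"name": "Jeebs", "age": 42}'   # hypothetical input JSON

person = Person(**json.loads(raw))     # "map" the JSON onto the structure
print(person.name, person.age)

print(json.dumps(asdict(person)))      # and back: get Person as JSON
```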
Linux |
OMG! Ubuntu! |
How to Install Visual Studio Code on Ubuntu 20.04 |
http://feedproxy.google.com/~r/d0od/~3/rLX5KYuq6ew/how-to-install-visual-studio-code-on-ubuntu-20-04
|
How to Install Visual Studio Code on Ubuntu 20.04. Microsoft's Visual Studio Code is a powerful and popular open source code editor. In this post I show you how to install VS Code on Ubuntu 20.04 LTS and above. Visual Studio Code is available for … This post, "How to Install Visual Studio Code on Ubuntu 20.04", is from OMG! Ubuntu!. Do not reproduce elsewhere without permission. |
2021-06-24 16:44:00 |
Overseas Science |
NYT > Science |
Biden Administration Backs Oil Sands Pipeline Project |
https://www.nytimes.com/2021/06/24/climate/line-3-pipeline-biden.html
|
Biden Administration Backs Oil Sands Pipeline Project. The administration urged a court to throw out a challenge brought by tribal and environmental groups, backing a pipeline that would carry Canadian oil across Minnesota and Wisconsin. |
2021-06-24 16:14:59 |
Finance |
Financial Services Agency (金融庁) website |
Announced the agenda for the joint session of the 46th general meeting of the Financial System Council and the 34th meeting of the Financial Subcommittee. |
https://www.fsa.go.jp/singi/singi_kinyu/soukai/siryou/2021_0625.html
|
Financial System Council |
2021-06-24 17:00:00 |
News |
BBC News - Home |
Covid-19: Europe braces for surge in Delta variant |
https://www.bbc.co.uk/news/world-europe-57594954
|
august |
2021-06-24 16:37:42 |
News |
BBC News - Home |
UK deaths outnumber births for first time in 40 years |
https://www.bbc.co.uk/news/uk-57600757
|
birth |
2021-06-24 16:31:42 |
News |
BBC News - Home |
Canada: 751 unmarked graves found at residential school |
https://www.bbc.co.uk/news/world-us-canada-57592243
|
saskatchewan |
2021-06-24 16:13:55 |
News |
BBC News - Home |
Rudy Giuliani has New York law licence suspended |
https://www.bbc.co.uk/news/world-us-canada-57597551
|
allegations |
2021-06-24 16:33:57 |
News |
BBC News - Home |
Poundland says 10% of its products are not a pound |
https://www.bbc.co.uk/news/business-57601580
|
discount |
2021-06-24 16:14:30 |
News |
BBC News - Home |
Euro 2020: Last-16 tie against Germany 'very tough test' for England, says Jordan Henderson |
https://www.bbc.co.uk/sport/football/57600479
|
Euro 2020: Last-16 tie against Germany "very tough test" for England, says Jordan Henderson. England midfielder Jordan Henderson says facing Germany in the European Championship last 16 will be a special game and is an exciting prospect. |
2021-06-24 16:18:58 |
News |
BBC News - Home |
Injured US Open champion Thiem to miss Wimbledon |
https://www.bbc.co.uk/sport/tennis/57600746
|
injury |
2021-06-24 16:37:26 |
News |
BBC News - Home |
Covid-19 in the UK: How many coronavirus cases are there in my area? |
https://www.bbc.co.uk/news/uk-51768274
|
cases |
2021-06-24 16:06:46 |
News |
BBC News - Home |
Covid vaccine: How many people in the UK have been vaccinated so far? |
https://www.bbc.co.uk/news/health-55274833
|
covid |
2021-06-24 16:15:45 |
Business |
不景気.com |
Osaka rebar construction firm Shinto Assetsu ordered into bankruptcy proceedings - 不景気.com |
https://www.fukeiki.com/2021/06/shinto-assetsu.html
|
Osaka City, Osaka Prefecture |
2021-06-24 16:50:37 |
Hokkaido |
北海道新聞 |
Tsunami of over 20 m forecast for 9 Hokkaido municipalities under Pacific-coast earthquake projection; 10,000 ha of flooding in Tomakomai |
https://www.hokkaido-np.co.jp/article/559494/
|
Kuril Trench |
2021-06-25 01:16:31 |
Hokkaido |
北海道新聞 |
Over half of seniors have received their first vaccine dose; Hokkaido's rate of 36% is the lowest in the nation |
https://www.hokkaido-np.co.jp/article/559259/
|
Novel coronavirus |
2021-06-25 01:15:39 |
Hokkaido |
北海道新聞 |
"Windows 11" unveiled: Microsoft's first OS update in six years |
https://www.hokkaido-np.co.jp/article/559529/
|
Update |
2021-06-25 01:13:00 |
GCP |
Cloud Blog |
Accelerate Google Cloud database migration assessments with EPAM’s migVisor |
https://cloud.google.com/blog/products/databases/faster-google-cloud-database-migration-assessments-with-epams-migvisor/
|
Accelerate Google Cloud database migration assessments with EPAM's migVisor. Editor's note: Today we're announcing the Database Migration Assessment and a partnership with software development company EPAM, allowing Google Cloud customers access to migVisor to conduct a database migration assessment. Today we're announcing our latest offering, the Database Migration Assessment, a Google Cloud-led project to help customers accelerate their deployment to Google Cloud databases with a free evaluation of their environment.
A comprehensive approach to database migrations: Google Cloud continues to double down on its database migration and modernization strategy to help our customers de-risk their journey to the cloud. In this blog we share our comprehensive migration offering, which includes people, expertise, processes, and technology. People: Google Cloud's Database Migration and Modernization Delivery Center is led by Google Database Experts who have strong database migration skills and a deep understanding of how to deploy on Google Cloud databases for maximum performance, reliability, and improved total cost of ownership (TCO). Process: We've standardized an approach to assessing databases, which streamlines migrating and modernizing data-centric workloads. This process shortens the duration of migrations and reduces the risk of migrating production databases. Our migration methodology addresses priority use cases such as zero-downtime, heterogeneous, non-intrusive, and serverless migrations; this, combined with a clear path to database optimization using Cloud SQL Insights, gives customers a complete assessment-to-migration solution. Technology: Customers can use third-party tools like migVisor to do assessments for free, as well as native Google Cloud tools like Database Migration Service (DMS), to de-risk migrations and accelerate their biggest projects. "This assessment helped us de-risk migration plans with phases focused on migration, modernization, and transformation. The assessment output has become the source of truth for us and we continuously refer to it as we make plans for the future." (Vismay Thakkar, VP of infrastructure, Backcountry)
Accelerate database migration assessments with migVisor from EPAM: To automate the assessment phase, we've partnered with EPAM, a provider with strategic specialization in database and application modernization solutions. Their Database Migration Assessment tool, migVisor, is a first-of-its-kind cloud database migration assessment product that helps companies analyze database workloads and generate a visual cloud migration roadmap that identifies potential quick wins as well as areas of challenge. migVisor will be made available to customers and partners, allowing for the acceleration of migration timelines for Oracle, Microsoft SQL Server, PostgreSQL, and MySQL databases to Google Cloud databases. "We believe that by incorporating migVisor as part of our key solution offering for cloud database migrations and enabling our customers to leverage it early on in the migration process, they can complete their migrations in a more cost-effective, optimized, and successful way. For us, migVisor is a key differentiating factor when compared to other cloud providers." (Paul Miller, Database Solutions, Google Cloud) migVisor helps identify the best migration path for each database, using sophisticated scoring logic to rank databases according to the complexity of migrating to a cloud-centric technology stack, and users get a customized migration roadmap to help in planning. Backcountry is one such customer who embraced migVisor by EPAM. "Backcountry is on a technology upgrade cycle and is keen to realize the benefit of moving to a fully managed cloud database. Google Cloud has been an awesome partner in helping us on this journey," says Vismay Thakkar, VP of infrastructure, Backcountry. "We used Google's offer for a complete Database Migration Assessment and it gave us a comprehensive understanding of our current deployment, migration cost and time, and post-migration opex. The assessment featured an automated process, with rich migration complexity dashboards generated for individual databases with migVisor."
A smart approach to database modernization: We know a customer's migration path from on-premises databases to managed cloud database services ranges in complexity, but even the most straightforward migration requires careful evaluation and planning. Customer database environments often leverage database technologies from multiple vendors across different versions and can run into thousands of deployments, which makes manual assessment cumbersome and error-prone. migVisor offers users a simple automated collection tool to analyze metadata across multiple database types, assess migration complexity, and provide a roadmap to carry out phased migrations, thus reducing risk. "Migrating out of commercial and expensive database engines is one of the key pillars and a tangible incentive for reducing TCO as part of a cloud migration project," says Yair Rozilio, senior director of cloud data solutions, EPAM. "We created migVisor to overcome the bottleneck and lack of precision the database assessment process brings to most cloud migrations. migVisor helps our customers easily identify which databases provide the quickest path to the cloud, which enables companies to drastically cut on-premises database licensing and operational expenses."
Get started today: Using the Database Migration Assessment, customers will be able to better plan migrations, reduce risk and missteps, identify quick wins for TCO reduction, and review migration complexities to appropriately plan out the migration phases for best outcomes. Learn more about the Database Migration Assessment and how it can help customers reduce the complexity of migrating databases to Google Cloud. |
2021-06-24 17:00:00 |