Posted: 2022-06-09 00:36:53. RSS feed digest as of 2022-06-09 00:00 (45 items)

Category Site name Article title / trend word Link URL Frequent words / summary / search volume Registration date
IT 気になる、記になる… Apple reportedly developing a new 14.1-inch iPad with an M2 chip https://taisy0.com/2022/06/08/157943.html apple 2022-06-08 14:30:56
IT 気になる、記になる… Microsoft to significantly scale back its business in Russia https://taisy0.com/2022/06/08/157941.html microsoft 2022-06-08 14:07:32
AWS AWS Netscout uses Amazon OpenSearch Service For Service Assurance | Amazon Web Services https://www.youtube.com/watch?v=YuApz9Oiv5U Netscout, a global provider of service assurance, offers a threat intelligence platform called Omnis Threat Horizon, built on AWS and using Amazon OpenSearch Service as its analytical engine. A cost-effective solution that scales rapidly and efficiently allows Netscout to present threat intelligence to its customers for free. Because Amazon OpenSearch Service is open source, Netscout can innovate very quickly. ABOUT AWS: Amazon Web Services (AWS) is the world's most comprehensive and broadly adopted cloud platform, offering fully featured services from data centers globally. Millions of customers, including the fastest-growing startups, largest enterprises, and leading government agencies, are using AWS to lower costs, become more agile, and innovate faster. #aws #analytics #cybersecurity #opensearch #managedelasticsearch #serviceassurance #securityanalytics #AWS #AmazonWebServices #CloudComputing 2022-06-08 14:33:32
python New posts tagged Python - Qiita Python Books [June 2022] https://qiita.com/netineti512/items/0e30d9868cc36428ffe5 jupyte 2022-06-08 23:51:05
python New posts tagged Python - Qiita [First post] Trying to train Google Magenta on Final Fantasy battle music to compose a new track https://qiita.com/Masanori_Aisaka/items/15f3653c0dcd17c895e3 googlemagenta 2022-06-08 23:47:50
python New posts tagged Python - Qiita Django from absolute zero, part 2 (configuration): let's edit settings.py https://qiita.com/aogumobc/items/8ac2751b0601d0c1477c managepy 2022-06-08 23:00:22
js New posts tagged JavaScript - Qiita Three.js sample 1 https://qiita.com/netineti512/items/cf83f2bd16199f36d890 gtlthtmlgtltheadgtltmetac 2022-06-08 23:10:52
Ruby New posts tagged Ruby - Qiita [Rails] Exception handling with rescue_from https://qiita.com/penpen22/items/5a3820a76cac163d3cde rails 2022-06-08 23:48:10
Ruby New posts tagged Ruby - Qiita [Personal notes] Reviewing Ruby basics, part 1 https://qiita.com/hondano_gentuki/items/25d71553b200808cfc2a development 2022-06-08 23:03:54
AWS New posts tagged AWS - Qiita How to update only the changed DBs when modifying a DynamoDB schema with the Serverless Framework https://qiita.com/naogify/items/36d5b5819a250f23509a dynamodb 2022-06-08 23:06:37
Azure New posts tagged Azure - Qiita Azure Percept basics (a quick tour) https://qiita.com/motoJinC25/items/7eb06deb782085a6532c azurepercept 2022-06-08 23:14:03
Ruby New posts tagged Rails - Qiita [Rails] Exception handling with rescue_from https://qiita.com/penpen22/items/5a3820a76cac163d3cde rails 2022-06-08 23:48:10
Ruby New posts tagged Rails - Qiita I want to remove console.log when using Terser with Rails https://qiita.com/dpkawa/items/d1de35e07ec6c1684c7b assetsjscompressorter 2022-06-08 23:15:33
Tech blog Developers.IO [GitHub Actions] Trying out vulnerability checks in a CI workflow using the Snyk Node Action https://dev.classmethod.jp/articles/github-actions-checking-for-vulnerabilities-on-ci-using-snyks-action/ actionssnyknodeaction 2022-06-08 14:28:01
Overseas TECH MakeUseOf What Is Rumble? https://www.makeuseof.com/what-is-rumble-video-platform/ platform 2022-06-08 14:45:14
Overseas TECH MakeUseOf How to Clean Your Windows PC Using Command Prompt https://www.makeuseof.com/windows-clean-files-command-prompt/ prompt 2022-06-08 14:40:14
Overseas TECH MakeUseOf How to Create Realistic Soap Bubbles in Procreate https://www.makeuseof.com/procreate-how-to-create-soap-bubbles/ procreate 2022-06-08 14:30:14
Overseas TECH MakeUseOf 5 Ways to Fix the “Invalid Partition Table” Error on Windows https://www.makeuseof.com/windows-invalid-partition-table-fix/ message 2022-06-08 14:15:14
Overseas TECH DEV Community How to provide an accessible high contrast alternative to a pastel color scheme? https://dev.to/ingosteinke/how-to-provide-an-accessible-high-contrast-alternative-to-a-pastel-color-scheme-396g How to provide an accessible high-contrast alternative to a pastel color scheme? Inspired by a discussion at a conference, I decided to research the problem, or rather challenge, and put it into proper words. I even dared to ask it as a question on StackOverflow, although most of my questions tend to get downvoted and deleted, so better to make a DEV blog post out of it as well. How is it possible to ensure a website's color theme offers a high-contrast alternative that complies with the WCAG minimum contrast requirements, while preferring a pastel, low-contrast theme unless the user wants or needs higher contrast? I tried to define a fallback theme with a higher contrast and to provide the lower-contrast version unless the user requires high contrast, using the prefers-contrast media query, but the following example (also available as a CodePen) fails the accessibility audit by the axe Chrome extension due to the low contrast of the foreground color #eeeeee against its pastel background. What CSS code is needed to define a proper fallback? How are users expected to indicate their contrast preference, and is there a way to make browsers or operating systems adapt the contrast preference based on daylight settings, dark/light theme, ambient light sensors, or the like? The example styles paragraphs with the pastel pair and overrides them with a higher-contrast pair (color #ffffff) inside an @media (prefers-contrast: more) block. Thoughts and answers are very welcome. 2022-06-08 14:39:46
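The prefers-contrast preference discussed in the summary above can also be read from JavaScript via the standard window.matchMedia API, which is one way for an app to pick a palette at runtime and react to live changes. A minimal sketch, not from the article itself: the pickPalette helper and all color values are hypothetical placeholders, not the author's actual theme.

```javascript
// Choose a palette based on the user's contrast preference.
// The color values here are placeholders, not the article's actual theme.
function pickPalette(prefersMoreContrast) {
  return prefersMoreContrast
    ? { background: "#000000", color: "#ffffff" } // WCAG-safe fallback
    : { background: "#ffeef8", color: "#888888" }; // pastel default
}

// Browser wiring (skipped outside a browser): apply the palette now and
// re-apply whenever the user's contrast preference changes.
if (typeof window !== "undefined" && window.matchMedia) {
  const query = window.matchMedia("(prefers-contrast: more)");
  const apply = () => {
    const palette = pickPalette(query.matches);
    document.body.style.backgroundColor = palette.background;
    document.body.style.color = palette.color;
  };
  apply();
  query.addEventListener("change", apply);
}
```

Keeping the palette choice in a pure function like pickPalette makes the contrast logic testable without a browser, while the CSS-only @media approach from the article remains the baseline for users with JavaScript disabled.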
Overseas TECH DEV Community Parsing logs from multiple data sources with Ahana and Cube https://dev.to/mikulskibartosz/parsing-logs-from-multiple-data-sources-with-ahana-and-cube-4jke Parsing logs from multiple data sources with Ahana and Cube. Ahana provides managed Presto clusters running in your AWS account. Presto is an open-source distributed SQL query engine, originally developed at Facebook and now hosted under the Linux Foundation. It connects to multiple databases or other data sources, for example Amazon S3, and we can use a Presto cluster as a single compute engine for an entire data lake. Presto implements data federation: you can process data from multiple sources as if they were stored in a single database. Because of that, you don't need a separate ETL (Extract, Transform, Load) pipeline to prepare the data before using it. However, running and configuring a single point of access for multiple databases or file systems requires Ops skills and additional effort, and no data engineer wants to do the Ops work. Using Ahana, you can deploy a Presto cluster within minutes without spending hours configuring the service, VPCs, and AWS access rights. Ahana hides the burden of infrastructure management and allows you to focus on processing your data. What is Cube? Cube is a headless BI platform for accessing, organizing, and delivering data. Cube connects to many data warehouses, databases, and query engines, including Presto, and allows you to quickly build data applications or analyze your data in BI tools. It serves as the single source of truth for your business metrics. This article will demonstrate the caching functionality, access control, and the flexibility of the data retrieval API. Integration: Cube's battle-tested Presto driver provides out-of-the-box connectivity to Ahana. You just need to provide the credentials: Presto host name and port, user name and password, Presto catalog, and schema. You'll also need to set CUBEJS_DB_SSL to true, since Ahana secures Presto connections with SSL.
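As a concrete illustration of those connection parameters, a Cube deployment pointed at an Ahana-managed Presto cluster typically boils down to a handful of environment variables. The variable names below follow Cube's generic Presto driver configuration as I recall it, and every value is a placeholder, so treat this as a sketch rather than copy-paste configuration:

```shell
# .env sketch for Cube connecting to an Ahana-managed Presto cluster.
# All host names and credentials are placeholders; CUBEJS_DB_SSL=true is
# the one setting the article calls out explicitly for Ahana.
CUBEJS_DB_TYPE=prestodb
CUBEJS_DB_HOST=example.cluster.ahana.cloud   # Presto endpoint from the Ahana console
CUBEJS_DB_PORT=443
CUBEJS_DB_USER=presto_user
CUBEJS_DB_PASS=presto_password
CUBEJS_DB_PRESTO_CATALOG=s3_catalog          # catalog configured in Ahana
CUBEJS_DB_SCHEMA=sagemaker_logs
CUBEJS_DB_SSL=true
```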
Check the docs to learn more about connecting Cube to Ahana. Example: Parsing logs from multiple data sources with Ahana and Cube. Let's build a real-world data application with Ahana and Cube. We will use Ahana to join Amazon Sagemaker Endpoint logs stored as JSON files in S3 with data retrieved from a PostgreSQL database. Suppose you work at a software house specializing in training ML models for your clients and delivering ML inference as a REST API. You have just trained new versions of all models, and you would like to demonstrate the improvements to the clients. Because of that, you do a canary deployment of the new versions and gather the predictions from the new and the old models using the built-in logging functionality of AWS Sagemaker Endpoints, a managed deployment environment for machine learning models. Additionally, you also track the actual production values provided by your clients. You need all of that to prepare personalized dashboards showing the results of your hard work. Let us show you how Ahana and Cube work together to help you achieve your goal quickly, without spending days reading cryptic documentation. You will retrieve the prediction logs from an S3 bucket and merge them with the actual values stored in a PostgreSQL database. After that, you calculate the ML performance metrics, implement access control, and hide the data source complexity behind an easy-to-use REST API. In the end, you want a dashboard looking like this. How to configure Ahana. Allowing Ahana to access your AWS account: first, let's log in to Ahana and connect it to your AWS account. We must create an IAM role allowing Ahana to access our AWS account. On the setup page, click the "Open CloudFormation" button. After clicking the button, we get redirected to the AWS page for creating a new CloudFormation stack from a template provided by Ahana. Create the stack and wait until CloudFormation finishes the setup. When the IAM role is configured, click the stack's Outputs tab and copy the AhanaCloudProvisioningRole key value. We have to paste it into the Role ARN field on the Ahana setup page and click the "Complete Setup" button. Creating an Ahana cluster: after configuring AWS access, we have to start a new Ahana cluster. In the Ahana dashboard, click the "Create new cluster" button. In the setup window, we can configure the type of the AWS EC2 instances used by the cluster, the scaling strategy, and the Hive Metastore. If you need a detailed description of the configuration options, look at the "Create new cluster" section of the Ahana documentation. Remember to add at least one user to your cluster. When we are satisfied with the configuration, we can click the "Create cluster" button. Ahana needs several minutes to set up a new cluster. Retrieving data from S3 and PostgreSQL: after deploying a Presto cluster, we have to connect our data sources to the cluster, because in this example the Sagemaker Endpoint logs are stored in S3 and PostgreSQL. Adding a PostgreSQL database to Ahana: in the Ahana dashboard, click the "Add new data source" button. We will see a page showing all supported data sources. Let's click the "Amazon RDS for PostgreSQL" option. In the setup form displayed below, we have to provide the database configuration and click the "Add data source" button. Adding an S3 bucket to Ahana: AWS Sagemaker Endpoints store their logs in an S3 bucket as JSON files. To access those files in Presto, we need to configure the AWS Glue data catalog and add the data catalog to the Ahana cluster. We have to log in to the AWS console, open the AWS Glue page, and add a new database to the data catalog, or use an existing one. Now let's add a new table. We won't configure it manually; instead, let's create a Glue crawler to generate the table definition automatically. On the AWS Glue page, we have to click the "Crawlers" link and click the "Add crawler" button. After typing the crawler's name and clicking the "Next" button, we will see the Source Type page. On this page we have to choose "Data stores" and "Crawl all folders" (in our case, "Crawl new folders only" would work too). On the "Data store" page, we pick the S3 data store, select the S3 connection (or click the "Add connection" button if we don't have an S3 connection configured yet), and specify the S3 path. Note that Sagemaker Endpoints store logs in subkeys using the following key structure: endpoint-name/model-variant/year/month/day/hour. We want to use those parts of the key as table partitions. Because of that, if our Sagemaker logs have an S3 key like s3://the-bucket-name/sagemaker-logs/endpoint-name/model-variant-name/year/month/day/hour, we put only the s3://the-bucket-name/sagemaker-logs key prefix in the setup window. Let's click the "Next" button. In the subsequent window, we choose "No" when asked whether we want to configure another data source. The Glue setup will ask about the name of the crawler's IAM role; we can create a new one. Next, we configure the crawler's schedule. A Sagemaker Endpoint adds new log files in near real time, so it makes sense to scan the files and add new partitions every hour. In the output configuration, we need to customize the settings. First, let's select the Glue database where the new tables get stored. After that, we modify the "Configuration options". We pick "Add new columns only", because we will make manual changes in the table definition and we don't want the crawler to overwrite them. Also, we want to add new partitions to the table, so we check the "Update all new and existing partitions with metadata from the table" box. Let's click "Next". We can check the configuration one more time in the review window and click the "Finish" button. Now we can wait until the crawler runs, or open the AWS Glue Crawlers view and trigger the run manually. When the crawler finishes running, we go to the Tables view in AWS Glue and click the table name. In the table view, we click the "Edit table" button and change the "Serde serialization lib" to "org.apache.hive.hcatalog.data.JsonSerDe", because the AWS JSON serialization library isn't available in the Ahana Presto cluster. We should also click the "Edit schema" button and change the default partition names to the values shown in the screenshot below. After saving the changes, we can add the Glue data catalog to our Ahana Presto cluster. Configuring data sources in the Presto cluster: go back to the Ahana dashboard and click the "Add data source" button. Select the "AWS Glue Data Catalog for Amazon S3" option in the setup form. Let's select our AWS region and put the AWS account id in the "Glue Data Catalog ID" field. After that, we click the "Open CloudFormation" button and apply the template. We will have to wait until CloudFormation creates the IAM role. When the role is ready, we copy the role ARN from the Outputs tab and paste it into the "Glue S3 Role ARN" field. On the Ahana setup page, we click the "Add data source" button. Adding data sources to an existing cluster: finally, we can add both data sources to our Ahana cluster. We have to open the Ahana "Clusters" page, click the "Manage" button, and scroll down to the "Data Sources" section. In this section, we click the "Manage data sources" button. We will see another setup page where we check the boxes next to the data sources we want to configure and click the "Modify cluster" button. We will need to confirm that we want to restart the cluster to make the changes. Writing the Presto queries: before configuring Cube, let's write the Presto queries to retrieve the data we want. The actual structure of the input and output of an AWS Sagemaker Endpoint depends on us; we can send any JSON request and return a custom JSON object. Let's assume that our endpoint receives a request containing the input data for the machine learning model and a correlation id. We will need those ids to join the model predictions with the actual data. Example input: a time series field plus a correlation id (e.g. cfbba ba fe abca). In the response, the model returns a JSON object with a single "prediction" key and a decimal value. A single request in Sagemaker Endpoint logs
looks like this captureData endpointInput observedContentType application json mode INPUT data eyJaWlXNlcmllcyIIFsMSMjMMjAzODYxNTAzODUsIDMLjUwOTkODcMTYwNzMLCAzNiNTkMzIOTQNjAwNTYsIDYLjAyMTUMzEyNjYyNDgLCAMCzMjkwMzUMDgyMjIwODUsIDIyLjkMDgMjgxNDgMzExLCANCMjQxNTUMTEMTQyOCwgMzkuMDMNzAMjgODcODALCAyMCNzQNjkOTMMzAxMTUsIDQLjcMzYMDQMjIMDINSwgMzcuNTgxMDYzNzUyNjYNTELCAOCxMTcMzQNjENDMOCwgMzYuODgwNzExNTAyNDIxMywgMzkuNzEMjgNTMNzYODksIDUxLjkxMDYxODYyNzgODYyLCAOSMzkMjQwMTQNDMOCwgNDIuODMOTAMDIxMDkwMzksIDILjYwOTUMTYMDYyNzkzLCAzOSMDczNzUNDQwODYyOCwgMzUuMTAOTQMzINjQwOFsICJjbJyZWxhdGlvblpZCIICJjZjhiNIYSYjhhLTQZmUtOTgxNCxMWEYjEYzcxMGEifQ encoding BASE endpointOutput observedContentType application json mode OUTPUT data eyJwcmVkaWNaWuIjogMjEuMjYMTQNjENDQOTUfQ encoding BASE eventMetadata eventId ba fbc fa cedbe inferenceTime T Z AWS Sagemaker Endpoints encode the request and response using base Our query needs to decode the data before we can process it Because of that our Presto query starts with data decoding with sagemaker as select model name variant name cast json extract FROM UTF from base capturedata endpointinput data correlation id as varchar as correlation id cast json extract FROM UTF from base capturedata endpointoutput data prediction as double as prediction from s sagemaker logs logs actual as select correlation id actual value from postgresql public actual values After that we join both data sources and calculate the absolute error value logs as select model name variant name as model variant sagemaker correlation id prediction actual value as actual from sagemaker left outer join actual on sagemaker correlation id actual correlation id errors as select abs prediction actual as abs err model name model variant from logs Now we need to calculate the percentiles using the approx percentile function Note that we group the percentiles by model name and model variant Because of that Presto will produce only a single row per every model variant pair That ll be important when we 
write the second part of this query percentiles as select approx percentile abs err as perc approx percentile abs err as perc approx percentile abs err as perc approx percentile abs err as perc approx percentile abs err as perc approx percentile abs err as perc approx percentile abs err as perc approx percentile abs err as perc approx percentile abs err as perc approx percentile abs err as perc model name model variant from errors group by model name model variant In the final part of the query we will use the filter expression to count the number of values within buckets Additionally we return the bucket boundaries We need to use an aggregate function max or any other aggregate function because of the group by clause That won t affect the result because we returned a single row per every model variant pair in the previous query SELECT count FILTER WHERE e abs err lt perc AS perc max perc as perc value count FILTER WHERE e abs err gt perc and e abs err lt perc AS perc max perc as perc value count FILTER WHERE e abs err gt perc and e abs err lt perc AS perc max perc as perc value count FILTER WHERE e abs err gt perc and e abs err lt perc AS perc max perc as perc value count FILTER WHERE e abs err gt perc and e abs err lt perc AS perc max perc as perc value count FILTER WHERE e abs err gt perc and e abs err lt perc AS perc max perc as perc value count FILTER WHERE e abs err gt perc and e abs err lt perc AS perc max perc as perc value count FILTER WHERE e abs err gt perc and e abs err lt perc AS perc max perc as perc value count FILTER WHERE e abs err gt perc and e abs err lt perc AS perc max perc as perc value count FILTER WHERE e abs err gt perc and e abs err lt perc AS perc max perc as perc value p model name p model variantFROM percentiles p errors e group by p model name p model variant How to configure Cube In our application we want to display the distribution of absolute prediction errors We will have a chart showing the difference between the actual value and 
the model s prediction Our chart will split the absolute errors into buckets percentiles and display the number of errors within every bucket If the new variant of the model performs better than the existing model we should see fewer large errors in the charts A perfect and unrealistic model would produce a single error bar in the left most part of the chart with the “ label At the beginning of the article we looked at an example chart that shows no significant difference between both model variants If the variant B were better than the variant A its chart could look like this note the axis values in both pictures Creating a Cube deploymentCube Cloud is the easiest way to get started with Cube It provides a fully managed ready to use Cube cluster However if you prefer self hosting then follow this tutorial First please create a new Cube Cloud deployment Then open the Deployments page and click the “Create deployment button We choose the Presto cluster Finally we fill out the connection parameters and click the “Apply button Remember to enable the SSL connection Defining the data model in CubeWe have our queries ready to copy paste and we have configured a Presto connection in Cube Now we can define the Cube schema to retrieve query results Let s open the Schema view in Cube and add a new file In the next window type the file name errorpercentiles js and click “Create file In the following paragraphs we will explain parts of the configuration and show you code fragments to copy paste You don t have to do that in such small steps Below you see the entire content of the file Later we explain the configuration parameters const measureNames perc perc value perc perc value perc perc value perc perc value perc perc value perc perc value perc perc value perc perc value perc perc value perc perc value const measures Object keys measureNames reduce result name gt const sqlName measureNames name return result sqlName sql gt sqlName type max cube errorpercentiles sql with 
sagemaker as select model name variant name cast json extract FROM UTF from base capturedata endpointinput data correlation id as varchar as correlation id cast json extract FROM UTF from base capturedata endpointoutput data prediction as double as prediction from s sagemaker logs logs actual as select correlation id actual value from postgresql public actual values logs as select model name variant name as model variant sagemaker correlation id prediction actual value as actual from sagemaker left outer join actual on sagemaker correlation id actual correlation id errors as select abs prediction actual as abs err model name model variant from logs percentiles as select approx percentile abs err as perc approx percentile abs err as perc approx percentile abs err as perc approx percentile abs err as perc approx percentile abs err as perc approx percentile abs err as perc approx percentile abs err as perc approx percentile abs err as perc approx percentile abs err as perc approx percentile abs err as perc model name model variant from errors group by model name model variant SELECT count FILTER WHERE e abs err lt perc AS perc max perc as perc value count FILTER WHERE e abs err gt perc and e abs err lt perc AS perc max perc as perc value count FILTER WHERE e abs err gt perc and e abs err lt perc AS perc max perc as perc value count FILTER WHERE e abs err gt perc and e abs err lt perc AS perc max perc as perc value count FILTER WHERE e abs err gt perc and e abs err lt perc AS perc max perc as perc value count FILTER WHERE e abs err gt perc and e abs err lt perc AS perc max perc as perc value count FILTER WHERE e abs err gt perc and e abs err lt perc AS perc max perc as perc value count FILTER WHERE e abs err gt perc and e abs err lt perc AS perc max perc as perc value count FILTER WHERE e abs err gt perc and e abs err lt perc AS perc max perc as perc value count FILTER WHERE e abs err gt perc and e abs err lt perc AS perc max perc as perc value p model name p model 
variantFROM percentiles p errors e group by p model name p model variant preAggregations Pre Aggregations definitions go here Learn more here joins measures measures dimensions modelVariant sql model variant type string modelName sql model name type string In the sql property we put the query prepared earlier Note that your query MUST NOT contain a semicolon We will group and filter the values by the model and variant names so we put those columns in the dimensions section of the cube configuration The rest of the columns are going to be our measurements We can write them out one by one like this measures perc sql perc type max perc sql perc type max perc sql perc type max perc sql perc type max perc sql perc type max perc sql perc type max perc sql perc type max perc sql perc type max perc sql perc type max perc sql perc type max perc value sql perc value type max perc value sql perc value type max perc value sql perc value type max perc value sql perc value type max perc value sql perc value type max perc value sql perc value type max perc value sql perc value type max perc value sql perc value type max perc value sql perc value type max perc value sql perc value type max dimensions modelVariant sql model variant type string modelName sql model name type string The notation we have shown you has lots of repetition and is quite verbose We can shorten the measurements defined in the code by using JavaScript to generate them We had to add the following code before using the cube function First we have to create an array of column names const measureNames perc perc value perc perc value perc perc value perc perc value perc perc value perc perc value perc perc value perc perc value perc perc value perc perc value Now we must generate the measures configuration object We iterate over the array and create a measure configuration for every column const measures Object keys measureNames reduce result name gt const sqlName measureNames name return result sqlName sql gt 
sqlName type max Finally we can replace the measure definitions with measures measures After changing the file content click the “Save All button And click the Continue button in the popup window In the Playground view we can test our query by retrieving the chart data as a table or one of the built in charts Configuring access control in CubeIn the Schema view open the cube js file We will use the queryRewrite configuration option to allow or disallow access to data First we will reject all API calls without the models field in the securityContext We will put the identifier of the models the user is allowed to see in their JWT token The security context contains all of the JWT token variables For example we can send a JWT token with the following payload Of course in the application sending queries to Cube we must check the user s access right and set the appropriate token payload Authentication and authorization are beyond the scope of this tutorial but please don t forget about them After rejecting unauthorized access we add a filter to all queries We can distinguish between the datasets accessed by the user by looking at the data specified in the query We need to do it because we must filter by the modelName property of the correct table In our queryRewrite configuration in the cube js file we use the query filter push function to add a modelName IN model model … clause to the SQL query module exports queryRewrite query securityContext gt if securityContext models throw new Error No models found in Security Context query filters push member percentiles modelName operator in values securityContext models return query Configuring caching in CubeBy default Cube caches all Presto queries for minutes Even though Sagemaker Endpoints stores logs in S in near real time we aren t interested in refreshing the data so often Sagemaker Endpoints store the logs in JSON files so retrieving the metrics requires a full scan of all files in the S bucket When we gather logs over 
a long time the query may take some time Below we will show you how to configure the caching in Cube We recommend doing it when the end user application needs over one second to load the data For the sake of the example we will retrieve the value only twice a day Preparing data sources for cachingFirst we must allow Presto to store data in both PostgreSQL and S It s required because in the case of Presto Cube supports only the simple pre aggregation strategy Therefore we need to pre aggregate the data in the source databases before loading them into Cube In PostgreSQL we grant permissions to the user account used by Presto to access the database GRANT CREATE ON SCHEMA the schema we use TO the user used in presto GRANT USAGE ON SCHEMA the schema we use TO the user used in presto If we haven t modified anything in the AWS Glue data catalog Presto already has permission to create new tables and store their data in S but the schema doesn t contain the target S location yet so all requests will fail We must login to AWS Console open the Glue data catalog and create a new database called prod pre aggregations In the database configuration we must specify the S location for the table content If you want to use a different database name follow the instructions in our documentation Caching configuration in CubeLet s open the errorpercentiles js schema file Below the SQL query we put the preAggregations configuration preAggregations cacheResults type rollup measures errorpercentiles perc errorpercentiles perc value errorpercentiles perc errorpercentiles perc value errorpercentiles perc errorpercentiles perc value errorpercentiles perc errorpercentiles perc value errorpercentiles perc errorpercentiles perc value errorpercentiles perc errorpercentiles perc value errorpercentiles perc errorpercentiles perc value errorpercentiles perc errorpercentiles perc value errorpercentiles perc errorpercentiles perc value errorpercentiles perc errorpercentiles perc value dimensions 
errorpercentiles modelName errorpercentiles modelVariant refreshKey every hour After testing the development version we can also deploy the changes to production using the “Commit amp Push button When we click it we will be asked to type the commit message When we commit the changes the deployment of a new version of the endpoint will start A few minutes later we can start sending queries to the endpoint We can also check the pre aggregations window to verify whether Cube successfully created the cached data Now we can move to the Playground tab and run our query We should see the “Query was accelerated with pre aggregation message if Cube used the cached values to handle the request Building the front end applicationCube can connect to a variety of tools including Jupyter Notebooks Superset and Hex However we want a fully customizable dashboard so we will build a front end application Our dashboard consists of two parts the website and the back end service In the web part we will have only the code required to display the charts In the back end we will handle authentication and authorization The backend service will also send requests to the Cube REST API Getting the Cube API key and the API URLBefore we start we have to copy the Cube API secret Open the settings page in Cube Cloud s web UI and click the “Env vars tab In the tab you will see all of the Cube configuration variables Click the eye icon next to the CUBEJS API SECRET and copy the value We also need the URL of the Cube endpoint To get this value click the “Copy API URL link in the top right corner of the screen Back end for front endNow we can write the back end code First we have to authenticate the user We assume that you have an authentication service that verifies whether the user has access to your dashboard and which models they can access In our examples we expect those model names in an array stored in the allowedModels variable After getting the user s credentials we have to generate a JWT to 
authenticate Cube requests Note that we have also defined a variable for storing the CUBE URL Put the URL retrieved in the previous step as its value ​​const jwt require jsonwebtoken CUBE URL function create cube token const CUBE API SECRET your token Don t store it in the code Pass it as an environment variable at runtime or use the secret management feature of your container orchestration system const cubejsToken jwt sign models allowedModels CUBE API SECRET expiresIn d return cubejsToken We will need two endpoints in our back end service the endpoint returning the chart data and the endpoint retrieving the names of models and variants we can access We create a new express application running in the node server and configure the models endpoint const request require request const express require express const bodyParser require body parser const port const app express app use bodyParser json app get models getAvailableModels app listen port gt console log Server is running on port port In the getAvailableModels function we query the Cube Cloud API to get the model names and variants It will return only the models we are allowed to see because we have configured the Cube security context Our function returns a list of objects containing the modelName and modelVariant fields function getAvailableModels req res res setHeader Content Type application json request post CUBE URL load headers Authorization create cube token Content Type application json body JSON stringify query dimensions errorpercentiles modelName errorpercentiles modelVariant timeDimensions order errorpercentiles modelName asc err res body gt if err console log err body JSON parse body response body data map item gt return modelName item errorpercentiles modelName modelVariant item errorpercentiles modelVariant res send JSON stringify response Let s retrieve the percentiles and percentile buckets To simplify the example we will show only the query and the response parsing code The rest of the code 
stays the same as in the previous endpoint. The query specifies all measures we want to retrieve and sets the filter to get data belonging to a single model's variant. We could retrieve all data at once, but we do it one by one for every variant. (The numeric suffixes of the percentile fields were lost in syndication, so they appear below as `perc…`.)

```javascript
query: {
  measures: [
    // one bucket-count measure per percentile…
    'errorpercentiles.perc…',
    // …and one boundary-value measure per percentile:
    'errorpercentiles.perc….value',
  ],
  dimensions: [
    'errorpercentiles.modelName',
    'errorpercentiles.modelVariant',
  ],
  filters: [
    {
      member: 'errorpercentiles.modelName',
      operator: 'equals',
      values: [req.query.model],
    },
    {
      member: 'errorpercentiles.modelVariant',
      operator: 'equals',
      values: [req.query.variant],
    },
  ],
}
```

The response-parsing code extracts the number of values in every bucket and prepares bucket labels:

```javascript
response = body.data.map((item) => ({
  modelName: item['errorpercentiles.modelName'],
  modelVariant: item['errorpercentiles.modelVariant'],
  labels: [
    '<' + item['errorpercentiles.perc….value'],
    // one "low–high" label per pair of adjacent percentile boundaries…
    '>' + item['errorpercentiles.perc….value'],
  ],
  values: [
    // one bucket count per label:
    item['errorpercentiles.perc…'],
  ],
}));
```

Dashboard website

In the last step, we build the dashboard website using Vue.js. If you are interested in copy-pasting working code, we have prepared the entire example in a
CodeSandbox. Below, we explain the building blocks of our application. We define the main Vue component encapsulating the entire website content. In the script section, we download the model and variant names. In the template, we iterate over the retrieved models and generate a chart for each of them. We put the charts in the Suspense component to allow asynchronous loading. (To keep the example short, we will skip the CSS style part.)

```vue
<script setup>
import OwnerName from './components/OwnerName.vue';
import ChartView from './components/ChartView.vue';
import axios from 'axios';
import { ref } from 'vue';

const models = ref([]);
axios.get(SERVER_URL + '/models').then((response) => {
  models.value = response.data;
});
</script>

<template>
  <header>
    <div class="wrapper">
      <OwnerName name="Test Inc." />
    </div>
  </header>
  <main>
    <div v-for="model in models" v-bind:key="model.modelName">
      <Suspense>
        <ChartView
          v-bind:title="model.modelName"
          v-bind:variant="model.modelVariant"
          type="percentiles"
        />
      </Suspense>
    </div>
  </main>
</template>
```

The OwnerName component displays our client's name. We will skip its code, as it's irrelevant in our example. In the ChartView component, we use the vue-chartjs library to display the charts. Our setup script contains the required imports and registers the Chart.js components:

```javascript
import { Bar } from 'vue-chartjs';
import {
  Chart as ChartJS,
  Title,
  Tooltip,
  Legend,
  BarElement,
  CategoryScale,
  LinearScale,
} from 'chart.js';
import { ref } from 'vue';
import axios from 'axios';

ChartJS.register(Title, Tooltip, Legend, BarElement, CategoryScale, LinearScale);
```

We have bound the title, variant, and chart type to the ChartView instance. Therefore, our component definition must contain those properties:

```javascript
const props = defineProps({
  title: String,
  variant: String,
  type: String,
});
```

Next, we retrieve the chart data and labels from the back-end service. We will also prepare the variable containing the label text:

```javascript
const response = await axios.get(
  `${SERVER_URL}/${props.type}?model=${props.title}&variant=${props.variant}`
);
```
```javascript
const data = response.data.values;
const labels = response.data.labels;
const label_text = 'Number of prediction errors of a given value';
```

Finally, we prepare the chart configuration variables:

```javascript
const chartData = ref({
  labels: labels,
  datasets: [
    {
      label: label_text,
      backgroundColor: '#f…', // color code lost in the syndicated source
      data: data,
    },
  ],
});

const chartOptions = {
  plugins: {
    title: {
      display: true,
      text: props.title + ' ' + props.variant,
    },
    legend: { display: false },
    tooltip: { enabled: false },
  },
};
```

In the template section of the Vue component, we pass the configuration to the Bar instance:

```vue
<template>
  <Bar ref="chart" v-bind:chart-data="chartData" v-bind:chart-options="chartOptions" />
</template>
```

If we have done everything correctly, we should see a dashboard page with error distributions.

Wrapping up

Thanks for following this tutorial! We encourage you to spend some time reading the Cube and Ahana documentation. Please don't hesitate to like and bookmark this post, write a comment, give Cube a star on GitHub, join Cube's Slack community, and subscribe to the Ahana newsletter. 2022-06-08 14:31:39
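The bucket-label construction this tutorial describes (a "less than the first boundary" label, a range label per pair of adjacent boundaries, and a "greater than the last boundary" label) can be sketched as one small pure function. This is an illustrative reconstruction, not the article's exact code: the real field names lost their numeric suffixes in syndication, so the sketch takes plain arrays instead of a Cube response row.

```javascript
// Build the labels/values arrays a bar chart expects from percentile
// bucket boundaries and per-bucket counts. Illustrative sketch only.
function toChartSeries(bounds, counts) {
  // counts has one entry per bucket: below the first bound, between each
  // pair of adjacent bounds, and above the last bound.
  if (counts.length !== bounds.length + 1) {
    throw new Error('expected one count per bucket');
  }
  const labels = [`<${bounds[0]}`];
  for (let i = 1; i < bounds.length; i++) {
    labels.push(`${bounds[i - 1]}–${bounds[i]}`);
  }
  labels.push(`>${bounds[bounds.length - 1]}`);
  return { labels, values: counts };
}

// Example: three boundaries produce four buckets.
const series = toChartSeries([0.1, 0.5, 1.0], [12, 30, 7, 1]);
// series.labels → ['<0.1', '0.1–0.5', '0.5–1', '>1']
```

The returned object can be fed directly into the `chartData` structure shown in the tutorial.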
海外TECH DEV Community So what? https://dev.to/tigt/so-what-c8j

So what?

Last time, I promised to write about "getting the benefits that SPAs enjoy without suffering the consequences they extremely don't enjoy". And then Nolan Lawson wrote basically that, and then the madlad did it again. He included almost everything I would've:

- MPA pageloads are surprisingly tough to beat nowadays (paint holding, streaming HTML, cross-page code caching, back/forward caching, etc.)
- Service Worker rendering (also see Jeremy Wagner on why offline-first MPAs are cool)
- In theory, MPA page transitions are Real Soon Now. (In practice, kroger.com had none and our native app barely had any, so I didn't care.)

And his main point: if the only reason you're using an SPA is because "it makes navigations faster", then maybe it's time to re-evaluate that. I don't think he talked about how edge rendering and MPAs are good buds, but I mentioned it, so here's ticking that box.

Since Nolan said what I would've in fewer words, I'll cut to the chase: did my opinions in this series make a meaningfully fast site? This is the part where I put my money where my mouth was. Proving that speed mattered wasn't enough; we also had to convince people emotionally, to show everyone, god dammit, how much better our site would be if it were fast. The best way to get humans to feel something is to have them experience it. Is our website painful on the phones we sell? Time to inflict some pain.

The demo

I planned to demonstrate the importance of speed at our monthly product meeting. It went a little something like this:

1. Buy enough Poblano phones for attendees.
2. On those phones and a throttled connection, try using kroger.com: log in, search for "eggs", add some to cart, try to check out.
3. Repeat those steps on the demo.
4. Note how performance is the bedrock feature: without it, no other features exist.

Near the laptop with the horrible pun are some of the original demo phones. The KaiOS flip phone helped stop me from overspecializing for Chrome, or the Poblano VLE specs. A nice thing about
targeting wimpy phones is that the demo hardware cost me relatively little. Each Poblano was ≈$…, and a sale at the time knocked some down even further.

How fast was it? Sadly, I can't give you a demo, so this video will have to suffice. (The browser UI differs because each video was recorded at different times. Also: guess walmart.com's framework.)

Text description of the video: me racing to get the demo, kroger.com, Kroger's native app, amazon.com, and walmart.com all to the start of checkout as fast as possible. In order of how long each took:

1. The demo, in … seconds
2. amazon.com, in … seconds
3. The Kroger native app, in 1 minute … seconds
4. walmart.com, in … minutes … seconds
5. kroger.com, in … minutes … seconds

(If you want to improve dev.to's video accessibility, consider voting for my feature request for the <track> element.)

For a bit, our CDN contact got it semi-public on the real Internet. I was beyond excited to see this in AmeliaBR's Firefox devtools. That's Cincinnati, Ohio → Edmonton, Canada; … milliseconds ain't bad for a network response. But I was so happy because I knew we could get much faster:

- About … ms was from geographical distance, which can be improved by edge rendering, caching, etc.
- PCF's gorouters have a … ms delay. Luckily, we were dropping PCF.
- … ms from Nagle's algorithm, maybe even … ms from both Node.js and the reverse proxy. This is what TCP_NODELAY is for.
- Tweaked gzip/brotli compression, like their buffer sizes and flushing behavior.
- Lower-latency HTTPS configuration, such as smaller TLS record sizes.

Let's say that averages out to … ms in the real world. Based on the numbers in the first post, that's $… million/year based on kroger.com's TTFB today. Or …% of company profit at the time. The actual number would probably be higher: with a difference this large, latency→revenue stops being linear.

So… how'd it go? Or, as Jason Grigsby put it: "The burning questions are related to how it performed and what the organization thought about it. How much was adopted? Etc."

What did the organization think of it? The immediate reaction exceeded even my most indulgent
expectations. Only the sternest Dad Voice in the room could get enough quiet to finish the presentation. Important people stood up to say they'd like to see more bottom-up initiative like it. VIPs who didn't attend requested demos. Even some developers who disagreed with me on React and web performance admitted they were intrigued.

Which was nice, but kroger.com was still butt-slow. As far as how to learn anything from the demo, I think these were the options:

1. Adapt new principles to existing code
2. Rewrite (incrementally or not)
3. Separate MVP

Adapt new principles to kroger.com's existing code

Naturally, folks asked how to get our current React SSR architecture to be fast like the demo. And that's fine: why not React? Why not compromise and improve the existing site?

We tried it. Developers toiled in the Webpack mines for smaller bundles. We dropped IE to polyfill less. We changed the footer to static HTML. After months of effort, we shrank our JS bundle by ≈…%. One month later, we were back where we started.

Does that mean fast websites are too hard in React? C'mon, that's a clickbait question impossible to answer. But it was evidence that we as a company couldn't handle ongoing development in a React SPA architecture without constant site-speed casualties. Maybe it was for management reasons or education reasons, but after this cycle repeated a few times, a fair conclusion was: we couldn't hack it. When every new feature adds client-side JS, it felt like we were set up to lose before we even started. (Try telling a business that each new feature must replace an existing one. See how far you get.)

At some point, I was asked to write a cost-benefit analysis for the MPA architecture that made the demo fast, but in React. It's long enough that I can't repeat it here, so instead I'll do a Classic Internet Move: gloss a nuanced topic into controversial points.

Reasons not to use React for multi-page apps

React server-renders HTML slower than many other frameworks/languages. If you're server-rendering much more frequently, even
small differences add up. And the differences aren't that small.

React is kind of bad at page loads. react + react-dom are bigger than many frameworks, and their growth trendline is disheartening. In theory, React pages can be fast. In practice, they rarely are. VDOM is not the architecture you'd design if you wanted fast loads. Its rehydration annoys users, does lots of work at the worst possible time, and is fragile and hard to reason about. Do you want those risks on each page?

Okay, I feel like I have to back this one up at least:

"Performance metrics collected from real websites using SSR rehydration indicate its use should be heavily discouraged. Ultimately, the reason comes down to User Experience: it's extremely easy to end up leaving users in an 'uncanny valley'." (Rendering on the Web, §A Rehydration Problem: One App for the Price of Two)

The Virtual DOM approach inflicts a lot of overhead at page load:

1. Render the entire component tree
2. Read back the existing DOM
3. Diff the two
4. Render the reconciled component tree

That's a lot of unnecessary work if you're going to show something mostly identical to the initial text/html response.

Forget the performance for a second: even rehydrating correctly in React is tricky, so using it for an MPA risks breakage on every page:

- Why Server-Side Rendering in React Is So Hard
- The Perils of Rehydration
- Case study of SSR with React in a large e-commerce app
- Fixing Gatsby's rehydration issue
- gatsbyjs discussion: Gatsby, React & Hydration
- React bugs for "Server Rendering"

No, really, skim those links. The nature of their problems is more important than the specifics.

React fights the multi-page mental model. It prefers JS properties to HTML attributes (you know, the class vs. className thing). That's not a dealbreaker, but it's symptomatic: server-side React and its ecosystem strive to pretend they're in a browser. Differences between server and browser renders are considered isomorphic failures that should be fixed.

React promises upcoming ways to address these problems, but testing,
benching, and speculating on them would be a whole other post. (They also extremely didn't exist two years ago.) I'm not thrilled about how React's upcoming streaming and partial hydration seem to be implemented. I should test for due diligence, but a separate HTTP connection for a not-quite-JSON stream doesn't seem like it would play nice during page load. Taking it back to my goals: does Facebook even use React for its rural, low-spec, poorly connected customers? There is one data point: the almost-no-JS mbasic.facebook.com.

Rewrite kroger.com (incrementally or not)

Software rewrites are the Forever Joke. Developers say this will be the last rewrite, because finally we know how to do it right. Businesses, meanwhile, knowingly estimate how long each codebase will last based on how wrong the developers were in the past. Therefore, the natural question: should our next inevitable rewrite be Marko?

I was able to pitch my approach vs. another for internal R&D. I can't publish specifics, but I did make this inscrutable poster for it. (And because I'm an incorrigible web developer, I made it with HTML & CSS.) That bakeoff's official conclusion: "performance is an application concern, not the platform's fault". It was decided to target Developer Experience for the long term, not site speed. I was secretly relieved: how likely will a new architecture actually be faster if it's put through the same people, processes, and culture as the last architecture?

With the grand big-bang rewrite successfully avoided, we could instead try small incremental improvements: speed A/B tests. If successful, that's reason enough to try further improvements, and if those were successful…

The simplest thing that could possibly work seemed to be streaming static-asset <script> and <link> elements before the rest of the HTML. We'd rewrite the outer scaffolding HTML in Marko, then embed React into the dynamic parts of the page. Here's a simplified example of what I mean:
```marko
import { renderReactRoot, fetchDataDependencies } from './react-app'

<!doctype html>
<html lang="en-us">
  <head>
    <meta charset="utf-8">
    <meta name="viewport" content="width=device-width, initial-scale=1">
    <for|url| of=input.webpackStaticAssets>
      <if(url.endsWith('.js'))>
        <script defer src=url></script>
      </if>
      <if(url.endsWith('.css'))>
        <link rel="stylesheet" href=url>
      </if>
    </for>
    <PageMetadata request=input.request/>
  </head>
  <body>
    <await(fetchDataDependencies(input.request, input.response))>
      <@then|data|>
        $!{renderReactRoot(data)}
      </@then>
    </await>
  </body>
</html>
```

This had a number of improvements:

- Browsers could download and parse our static assets while the server waited on dynamic data and React SSR.
- Since Marko only serializes components with state, the outer HTML didn't add to our JS bundle. This had more impact than the above example suggests: our HTML scaffolding was more complicated, because it was a Real Codebase.
- If successful, we could rewrite components from the outside in, shrinking the bundle with each step.
- Marko also paid for itself with more efficient SSR and smaller HTML output (quote stripping, tag omission, etc.), so we didn't regress server metrics unless we wanted to.

This almost worked. But we were thwarted by our Redux code. Our Reducers'n'Friends contained enough redirect/page-metadata/analytics business logic that assumed the entire page would be sent all at once, where any code could walk back up the DOM at its leisure and change previously generated HTML… like the <head>. We tried to get dev time to overcome this problem, since we'd have to make Redux stream-friendly in a React world anyway. Unfortunately, Redux and its ecosystem weren't designed with streaming in mind, so assigning enough dev time to overcome those obstacles was deemed "not product-led enough".

Launch a separate, faster version of kroger.com

While the "make React do this" attempts and the streaming A/B test were, you know, fine, they weren't my favorite options. I favored launching a separate low-spec site with respectful redirects
(let's call it …). I liked this approach because:

- Minimum time it took for real people to benefit from a significant speedup.
- It helped with the culture paradox: your existing culture gave you the current site. Pushing a new approach through that culture will change your current culture or the result, and which one depends on how many people it has to go through. A small team with its own goals can incubate its own culture to achieve those goals.
- If it's a big enough success, it can run on its own results while accruing features, until the question "should we swap over?" becomes an obvious yes/no.

How much was adopted? Well… that's a long story.

The Performance team got rolled into the Web Platform team. That had good intentions, but in retrospect, a platform team's high-urgency deploys, monitoring, and incident responses inevitably crowd out important-but-low-urgency speed-improvement work.

Many folks were also taken with the idea of a separate, faster site. They volunteered skills and time to estimate the budget, set up CI/CD, and other favors. Their effort, kindness, and optimism amazed me. It seemed inevitable that something would happen; at least we'd get a concrete rejection that could inform what we tried next.

The good news: something did happen. The bad news: it was the USA spring lockdown.

After the initial shock, I realized I was in a unique position:

- COVID made it extremely dangerous to enter supermarkets.
- The pandemic was disproportionately hurting blue-collar jobs, high-risk folks, and the homeless.
- I had a proof of concept where even cheap and/or badly connected devices could quickly browse, buy, and order groceries online.
- People won't stop buying food or medicine, even with stay-at-home orders.

If we had a website that let even the poorest shop without stepping in our stores, it would save lives. Even if they could only browse, it would still cut down on in-store time. With a certainty of purpose I've never felt before or since, I threw myself into making a kroger-but-fast MVP. I knew it was
asking for burnout, but I also knew I'd regret any halfheartedness for the rest of my life; it would have been morally wrong not to try. We had the demo running in a prod bucket, agonizingly almost-public, only one secret login away. We tried to get anyone internally to use it to buy groceries. I'm not sure anyone bothered.

I don't know what exactly happened. My experience was very similar to Zack Argyle's with Pinterest Lite, without the happy ending. It took him years, so maybe I'm just impatient. I was a contractor, not a "real employee", so I wasn't privy to internal decisions; this also meant I couldn't hear why any of the proposals sent up the chain got lost or rejected. Once it filtered through the grapevine that Bridge maybe was competing for resources with a project like this… that was when I decided I was doing nothing but speedrunning hypertension by staying.

When bad things happen to fast code

On the one hand, the complete lack of real change is obvious. The demo intentionally rejected much of our design, development, and even management decisions to get the speed it needed. Some sort of skunkworks to insulate from ambient organizational pressures is often the only way a drastic improvement like this can work, and it's hard getting clearance for that.

Another reason is that, to make a drastic improvement on an existing product, there's an inherent paradox: a lot of folks' jobs depend on that product, and you can't get someone to believe something they're paid not to believe. Especially when the existing architecture was sold as faster than the even-more-previous one. And isn't that always the case?

It took me a while to understand how people could be personally enthusiastic but professionally could do nothing. One thing that helped was Quotes from Moral Mazes. Or, if you want a link less likely to depress you: I was trying to make a Level … project happen in an org that could charitably be described as Level ….

But enough about me. What about you? Maybe you're making a website that needs to be
fast. The first thing you gotta do is get real hardware that represents your users. Set the right benchmarks for the people you serve. Your technology choices must be informed by that, or you're just posturing.

If you're targeting cheap phones, though, I can tell you what I'd look at today. For the closest performance to my demo, try marko serve. Yes, I'm paid to work on Marko now, but what technology would better match my demo's speed than the same technology? But it's gauche to only recommend my employer's thing. What else, what else…

If your site doesn't need JS to work, then absolutely go for a static site. But for something with even sprinkles of interactivity, like e-commerce… well, there's a reason my demo didn't run JAMstack. My checklist of requirements:

- Streaming HTML (see the earlier part of this series for why).
- Minimum framework JS: at least half of react + react-dom.
- The ability to only hydrate some components, so your users only download JavaScript that actually provides dynamic functionality.
- Can render in CDN edge servers. This, unfortunately, is hard to do for languages other than JavaScript, unless you do something like Fly.io's One Weird Trick.

Svelte doesn't have partial hydration, but it tackles the too-much-app-JS problem in a different way, with its idiosyncratic "X components vs. Y bundle kB" curve. If Svelte implemented streaming HTML, then I'd recommend it in a heartbeat. Maybe someday.

If Preact had partial hydration and streaming, I'd recommend it too, even though Preact's goals don't always match mine; I can't argue with Jason Miller's consistent results. Preact probably will have equivalents of React's streaming and Server Components, right?

Remix is almost a recommend; its philosophies are spot-on. Its progressive-enhancement approach is exactly what I want, it can stream HTML, and they're doing invaluable work successfully convincing React devs that those things are important. This kind of stuff has me shaking my fists in agreement:

"Wouldn't it be great if we could just move all of that code out of the
browser and onto the server? Isn't it annoying to have to write a serverless function any time you need to talk to a database or hit an API that needs your private key? (Yes, it is.) These are the sorts of things React Server Components promise to do for us, and we can definitely look forward to that for data loading, but they don't do anything for mutations, and it'd be cool to move that code out of the browser as well." (Remix: The Yang to React's Yin)

"We've learned that fetching in components is the quickest way to the slowest UX (not to mention all the content layout shift that usually follows). It's not just the UX that suffers, either. The developer experience gets complex with all the context plumbing, global state management solutions that are often little more than a client-side cache of server-side state, and every component with data needing to own its own loading, error, and success states." (Remixing React Router)

Really, the only thing I don't like about Remix is… React. Check this perf trace (a sample from this remix-ecommerce.fly.dev WebPageTest trace). Sure, the main thread's only blocked for … seconds total, but I don't want to do that to users on every page navigation. That's a good argument for why Remix progressively enhances to client-side navigation… but I've already made my case on that. Ideally, Remix would let you use other frameworks, and I'd shove Marko in there. They've discussed the possibility, so who knows? 2022-06-08 14:20:26
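The streaming-shell idea the article above keeps returning to (flush the static-asset tags before the slow data-and-render work) can be sketched without any framework. This is a hypothetical illustration, not code from the article: the names `renderPage`, `staticAssets`, and `renderBody` are invented stand-ins for the Marko template's `webpackStaticAssets` loop and `<await>` block.

```javascript
// Stream the <head> with <script>/<link> tags immediately, so the browser
// can download assets while the server does the slow work. Illustrative
// sketch only; chunks would normally be piped to an HTTP response.
function* renderPage(staticAssets, renderBody) {
  yield '<!doctype html><html><head>';
  for (const url of staticAssets) {
    if (url.endsWith('.js')) yield `<script defer src="${url}"></script>`;
    if (url.endsWith('.css')) yield `<link rel="stylesheet" href="${url}">`;
  }
  yield '</head><body>';
  // Only now do the slow part: data fetch + app render.
  yield renderBody();
  yield '</body></html>';
}

// Collect the chunks in the order a client would receive them.
const chunks = [
  ...renderPage(['/app.js', '/app.css'], () => '<div id="root">…</div>'),
];
```

In a real server, each `yield` would be a `response.write()`, which is what lets the asset downloads overlap the data fetch.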
海外TECH DEV Community A stage for our open-source contributors https://dev.to/scopsy/a-stage-for-our-open-source-contributors-1f52

A stage for our open-source contributors

When we first showed the world Novu, an open-source notification infrastructure, we were amazed by the incredible adoption by our community members, and by how the project was shaped by the dozens of individual contributors from around the world. Over the past months, more than … commits were made by more than … different contributors, who sparked incredible ideas and helped with almost every aspect of Novu.co.

In the last couple of weeks, we thought of a way to showcase these incredible individuals who are bringing Novu to life. We are excited to share the Novu Community Heroes program, which highlights our contributors and showcases their work to the world. We created a unique page for every single contributor, highlighting their:

- Latest pull requests and contributions
- Public GitHub profile bio
- Social media links
- Contribution badges

You can find the link here. We encourage every open-source library to promote its contributors in every possible way. This project was not easy to make, as we needed a way to automate all our contributors' actions, so we have created this public repository if you are curious about how it was implemented. We have been using Orbit.love and GitHub to sync our contributors' work and grant them badges and medals once a PR was approved.

How to earn a Novu Contributor Badge: head over to Novu's issues page, find an issue that you might want to help with, and get a PR ready. Need any help? You can reach us on Discord.

The future of the Novu Contributors project: it's only the beginning. We have an excellent roadmap of more things we want to feature, such as Discord contributions, GitHub Discussions, GitHub issues, and content creators. Do you think of something specific we should add? Happy to hear it in the comments. 2022-06-08 14:06:20
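The kind of automation the Novu post describes (granting a badge once a PR is approved) can be sketched as a small decision function. Everything here is hypothetical for illustration: the badge names and thresholds are invented, not Novu's actual rules; their real pipeline syncs Orbit.love and GitHub data and lives in their public repository.

```javascript
// Hypothetical badge rule: bigger badge as merged PRs accumulate.
// Thresholds and names are invented for illustration only.
function badgeForContribution(mergedPrCount) {
  if (mergedPrCount >= 10) return 'gold';
  if (mergedPrCount >= 3) return 'silver';
  if (mergedPrCount >= 1) return 'bronze';
  return null; // no merged PRs yet
}
```

A webhook handler listening for merged pull-request events could call a function like this and write the result to each contributor's page.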
Apple AppleInsider - Frontpage News The five best apps to create healthy habits https://appleinsider.com/articles/22/06/08/the-five-best-apps-to-create-healthy-habits?utm_medium=rss Habit-tracking apps on iOS and iPadOS can be specialized or versatile, and can involve health or custom habits. Here are some of the best options to create a routine. [Image: woman writing in journal] A good choice for tracking multiple habits is certainly one of the versatile apps: WaterMinder can track liquids, but so can HabitMinder, in addition to other habits. Price is something to keep in mind, especially when comparing a subscription-based app with a one-time purchase. Read more 2022-06-08 14:56:57
Apple AppleInsider - Frontpage News Deals: Apple Watch Series 7 slashed to $300 at Amazon, a record low price https://appleinsider.com/articles/22/06/08/deals-apple-watch-series-7-slashed-to-300-at-amazon-a-record-low-price?utm_medium=rss Amazon's discount on the Apple Watch Series 7 drives the price down to an all-time low of $300. At press time, the Apple Watch Series 7 (GPS) is discounted when ordered with a Green Aluminum Case and Clover Sport Band. With Father's Day around the corner, the price drop at Amazon couldn't come at a better time, with the Apple Watch Series 7 occupying a spot in our Father's Day Gift Guide. Read more 2022-06-08 14:31:20
Apple AppleInsider - Frontpage News Apple's MacBook Pro 16-inch is back in stock for $2,299 ($200 off), plus $80 off AppleCare https://appleinsider.com/articles/22/05/25/apples-macbook-pro-16-inch-is-back-in-stock-for-2299-200-off-plus-80-off-applecare?utm_medium=rss This exclusive 16-inch MacBook Pro deal not only delivers the cheapest price available, but it's in stock at Apple reseller Adorama, avoiding Apple's months-long backorder delay. Apple's hard-to-find 16-inch MacBook Pro is $2,299 and in stock right now. Apple's standard MacBook Pro 16-inch is in stock now in the Silver finish, putting the popular system in your hands much faster than Apple itself, which is reporting a late-July to mid-August ETA. Read more 2022-06-08 14:43:04
海外TECH Engadget The best camera and photography gift ideas for dad https://www.engadget.com/best-camera-photography-gift-ideas-for-dad-150001904.html?src=rss When your dad decides to take his photography game to a new level, a smartphone may no longer be enough. Some may want a sports camera to capture their adventures, while others may need a mirrorless camera for better family photos, films, or artistic shots. Thanks to rapidly advancing technology, they keep getting better, with faster shooting speeds, sharper video, and incredible autofocus. We found five of the best models for budgets ranging from … to …, along with some accessories to complement the gear your old man already owns.

GoPro Hero Black: If your dad would rather star in his own sports adventures than watch them on TV, the Hero Black is the camera he needs. It has all of the stellar features its predecessor did, plus a new GP processor that brings faster performance and a boost in frame rates. We were impressed by its speedy user interface, the improved image quality, and the new "hydrophobic" lens coating that makes the camera a bit more water-resistant than previous models. We would still recommend dad being careful with it, though. Best of all, it can be yours for $… with a one-year GoPro subscription, a discount of $… off the regular price without a subscription. Buy the Hero Black bundle at GoPro.

Canon EOS M: So your dad is taking up photography? An entry-level camera is a good way to start out, and the best one out there is Canon's EOS M. With a …-megapixel sensor and Canon's skin-friendly colors, it delivers great photos. They're also easy to capture, thanks to an intuitive, smartphone-like interface, fast autofocus speeds, and great eye-detection performance. He'll also be able to shoot …K video (albeit with a crop), along with full-sensor …p at … fps. And it's available for significantly less than most other mirrorless cameras, complete with an EF-M …mm kit lens. Buy the Canon EOS M at B
&H Photo.

Sony A…: Sony cameras generally make great gifts, and the best value right now is the A…. It features class-leading autofocus and eye-tracking performance for humans and animals, ensuring sharp shots even with fast-moving subjects. Sony has also improved the color science and low-light capabilities, so family photos will be sharp and color-accurate even in dimly lit environments. The drawbacks are bad rolling shutter that can cause video wobble and a low-resolution electronic viewfinder. Still, at $… body-only, the A… is the best mirrorless camera in its price range. Buy the Sony A… at B&H Photo.

Fujifilm X-T…: Fujifilm's X-T… is the best crop-sensor camera on the market, making it a desirable gift for any lucky father. It's notably improved over its predecessor, with the addition of in-body stabilization and a fully articulating screen. At the same time, it has the best video features for an APS-C camera, with sharp …K video at up to … fps, along with …p at … fps. Both photo and video quality are outstanding, with great skin tones second only to Canon's models. The autofocus, with tracking and eye detection, is good but not quite up to Sony's standards. And while the generous manual controls deliver great handling, it's less compact than before. It's not cheap at $…, but it can hold its own against far more expensive full-frame cameras. Buy the Fujifilm X-T… at B&H Photo.

Canon EOS R…: For dads who can't decide between photos and video, Canon's EOS R… does both things well. The …-megapixel sensor lacks resolution compared to rivals, but it offers killer specs like in-body stabilization and Canon's fast and accurate Dual Pixel autofocus for video and photos, along with sharp …K video at up to … fps. Other features include a flip-out display, a relatively compact size, and skin tones that will flatter your dad's subjects (you, possibly). It does suffer from overheating issues with video, but that's only likely to affect pros who shoot for long stretches at a time. Overall, it's currently our best pick for under $….
Buy Canon EOS R at Amazon DJI OMEngadgetSmartphone stabilizers are fine but nothing tops a gimbal for tracking shots The best deal out there for mobile devices is DJI s OM ideal for your dad if he s tired of jerky tracking shots This model rocks a magnetic mount system that makes attaching your phone faster and easier plus a smaller design with a built in extension rod It also has features like “dynamic zoom and “spin shot that will give your dad a new repertoire of moves As with other DJI gimbals it delivers smooth reliable performance and has a solid app that s easy to use It s also relatively affordable You can grab one now with a grip and tripod for Buy DJI OM at Amazon Peak Design Everyday MessengerPeakWith its rugged practical design Peak Design s Everyday Messenger Bag is an ideal gift for adventurous or photo shooting dads It s built with a lightweight yet durable percent waterproof recycled D shell with the ingenious Flexfold dividers in the main storage area It also offers a pair of zipped pockets two elastic side pockets and a compartment big enough for a to inch laptop I own one myself and find it practical both for work and daily activities letting me fit a camera lens and laptop along with my wallet and keys At it s not the cheapest bag out there but your dad won t have to buy another for a good while Buy Everyday Messenger at Peak Design Magnus VT tripodMagnusFor dads serious about video the Magnus VT is the best budget tripod option It s stout enough to handle a mirrorless camera and accessories weighing up to pounds more than the eight pound weight of the tripod itself That lack of heft makes it practical for travel while the fluid head helps you tilt and pan smoothly Other features include a middle spreader to keep things steady and legs that extend up to inches so you can match the eyeline of your subjects All of these features come for a relative steal considering the quality Buy Magnus VT at B amp H Photo Joby GorillaPod K mini tripodJobyThe 
most useful accessories out there for vlogging dads are Joby s famous mini tripods and the best one for the money is the GorillaPod K Attaching your camera couldn t be easier thanks to the secure clip in mounting plate with a built in level The flexible also lets let you set your camera anywhere to shoot or even wrap it around a tree or other object And of course you can bend them out for the ideal vlogging angle and steady out your shooting to boot It s at Amazon right now a bargain for such a versatile tool Buy Joby GorillaPod K at Amazon SanDisk Extreme Pro SD cardSanDiskCamera loving dads can never get enough memory cards but they can be a pretty pricey gift One of the best budget options is SanDisk s ExtremePro UHS I SD cards While they don t offer the top MB s speeds of UHS II cards they re far cheaper and the MB s read write speeds are fast enough for most types of photography and video What s more you can transfer files at speeds up to MB s with a compatible reader and SanDisk is known for producing reliable cards SanDisk has models for all budgets with the GB version in the sweet spot at If that s too much the GB version is and the GB model a mere Buy SanDisk Extreme Pro GB card at Amazon 2022-06-08 14:50:15
Overseas Tech Engadget Senators introduce bipartisan bill to regulate crypto assets https://www.engadget.com/senate-crypto-regulation-bill-responsible-financial-innovation-act-143136481.html?src=rss
Politicians are quickly seizing on US government efforts to study and regulate crypto. Reuters reports that Senators Kirsten Gillibrand (D-NY) and Cynthia Lummis (R-WY) have introduced a bill, the Responsible Financial Innovation Act, that would forge a "complete regulatory framework" for cryptocurrency and other digital assets. The measure is meant to protect consumers and fold crypto into existing laws without restricting technical progress. The act would set clearer definitions, such as establishing which assets are commodities or securities. It would also create requirements for stablecoins (cryptocurrencies pegged to another asset, such as conventional money) to minimize risks and enable speedier payments. The Commodity Futures Trading Commission (CFTC) would have the power to regulate digital spot markets, while providers would be subject to disclosure requirements. There would be a "workable" tax structure that would let you buy products with cryptocurrency without having to account for and report income. The act would also prompt the government to further research digital assets. It would create a "sandbox" where federal and state regulators could work together on experimental launches of financial technology. The CFTC and the Securities and Exchange Commission would have to develop both security guidance and a self-regulatory organization. Other government agencies and offices would be tasked with studying energy consumption, the benefits and dangers of investing retirement savings in crypto, and the security concerns around China's official digital currency. The bipartisan nature of the bill could increase its chances of surviving a Senate vote, and Reuters also points out that the CFTC is considered friendlier to crypto assets than the SEC, which is potentially useful for winning over regulation-averse politicians worried the SEC might limit crypto's growth. A House equivalent has yet to appear, and it's unlikely that the bill would reach President Biden's desk before the current session of Congress ends. It's likewise unclear just which digital assets are covered and whether or not NFTs might be affected; we've asked for more details. The bill nonetheless represents the strongest effort yet to regulate crypto, and might just serve as a blueprint for future efforts to control and legitimize the blockchain in the US. 2022-06-08 14:31:36
Overseas Science NYT > Science How ‘Trustless’ Is Bitcoin, Really? https://www.nytimes.com/2022/06/06/science/bitcoin-nakamoto-blackburn-crypto.html different 2022-06-08 14:34:49
Finance RSS FILE - Japan Securities Dealers Association Shareholder community statistics and handling status https://www.jsda.or.jp/shiryoshitsu/toukei/kabucommunity/index.html shareholder community 2022-06-08 15:30:00
Finance Financial Services Agency website Announcement of administrative action against CrossBridge Venture Partners LLC https://www.fsa.go.jp/news/r3/shouken/20220608.html bridgeventurepartnersllc 2022-06-08 15:00:00
News BBC News - Home Petrol prices see biggest daily jump in 17 years https://www.bbc.co.uk/news/business-61731161?at_medium=RSS&at_campaign=KARANGA average 2022-06-08 14:56:23
News BBC News - Home Weinstein to be charged with indecent assault in UK https://www.bbc.co.uk/news/uk-61736763?at_medium=RSS&at_campaign=KARANGA hollywood 2022-06-08 14:32:29
News BBC News - Home Boris Johnson vows to get on with being prime minister https://www.bbc.co.uk/news/uk-politics-61729976?at_medium=RSS&at_campaign=KARANGA minister 2022-06-08 14:18:15
News BBC News - Home UK to see slowest growth of developed countries, says OECD https://www.bbc.co.uk/news/business-61729892?at_medium=RSS&at_campaign=KARANGA italy 2022-06-08 14:54:28
News BBC News - Home Birmingham 2022 Commonwealth Games: Talks with Army as 5,000 jobs vacant https://www.bbc.co.uk/news/uk-england-birmingham-61721577?at_medium=RSS&at_campaign=KARANGA birmingham 2022-06-08 14:04:47
News BBC News - Home British Virgin Islands: UK decides against direct rule of territory https://www.bbc.co.uk/news/uk-61736373?at_medium=RSS&at_campaign=KARANGA failures 2022-06-08 14:29:37
News BBC News - Home Has Boris Johnson met his pledges? https://www.bbc.co.uk/news/uk-politics-58401767?at_medium=RSS&at_campaign=KARANGA immigration 2022-06-08 14:01:35
News BBC News - Home Ollie Robinson: England pace bowler set for longer spell out https://www.bbc.co.uk/sport/cricket/61687178?at_medium=RSS&at_campaign=KARANGA august 2022-06-08 14:06:18
News BBC News - Home Tiffany Youngs, wife of Tom, dies after suffering from cancer https://www.bbc.co.uk/sport/rugby-union/61735100?at_medium=RSS&at_campaign=KARANGA england 2022-06-08 14:37:37
News BBC News - Home Isle of Man TT: Organisers confirm Cesar Chanal died not Olivier Lavorel https://www.bbc.co.uk/sport/motorsport/61733290?at_medium=RSS&at_campaign=KARANGA lavorel 2022-06-08 14:43:24
Hokkaido Hokkaido Shimbun <Kokokara> Will redevelopment bring the bustle back to central Kushiro? A post noting "the city center is covered in 'tenant wanted' signs" drew 80,000 likes; a "tour" to consider solutions is also underway https://www.hokkaido-np.co.jp/article/691232/ Kushiro 2022-06-08 23:11:42
Hokkaido Hokkaido Shimbun Prime minister plans maritime security cooperation with 20 countries; draft plan for a free Pacific due next spring https://www.hokkaido-np.co.jp/article/691247/ security 2022-06-08 23:08:00
Hokkaido Hokkaido Shimbun 121 new COVID-19 infections in the Kamikawa region https://www.hokkaido-np.co.jp/article/691022/ Kamikawa region 2022-06-08 23:04:07
Hokkaido Hokkaido Shimbun Southern Hokkaido tourism operators hope for more inbound visitors as entry resumes on the 10th, with a daily cap of 20,000 https://www.hokkaido-np.co.jp/article/691245/ acceptance 2022-06-08 23:01:00
Hokkaido Hokkaido Shimbun Omicron-targeted vaccine "promising," Moderna announces, with autumn vaccinations envisioned https://www.hokkaido-np.co.jp/article/691244/ COVID-19 2022-06-08 23:01:00
