IT |
気になる、記になる… |
Beats launches the Union collaboration "Beats Studio Buds – Limited Edition" today in a limited release |
https://taisy0.com/2021/12/01/149216.html
|
beats |
2021-11-30 17:03:50 |
AWS |
AWS News Blog |
Amazon Kinesis Data Streams On-Demand – Stream Data at Scale Without Managing Capacity |
https://aws.amazon.com/blogs/aws/amazon-kinesis-data-streams-on-demand-stream-data-at-scale-without-managing-capacity/
|
Amazon Kinesis Data Streams On-Demand: Stream Data at Scale Without Managing Capacity. Today we are launching Amazon Kinesis Data Streams On-Demand, a new capacity mode. This capacity mode eliminates capacity provisioning and management for streaming workloads. Kinesis Data Streams is a fully managed, serverless service for real-time processing of streamed data at a massive scale. Kinesis Data Streams can take any amount of data, from any number of … |
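As a rough illustration of the new mode (my sketch, not from the post), here is what creating and switching to on-demand capacity looks like with boto3; the stream name is made up, and it assumes a boto3 release recent enough to include the StreamModeDetails parameter:

    import boto3

    kinesis = boto3.client("kinesis", region_name="us-east-1")

    # Create a stream in the new on-demand capacity mode: no shard count to manage.
    kinesis.create_stream(
        StreamName="clickstream-demo",  # hypothetical name
        StreamModeDetails={"StreamMode": "ON_DEMAND"},
    )

    # Existing provisioned streams can be switched over in place.
    summary = kinesis.describe_stream_summary(StreamName="clickstream-demo")
    kinesis.update_stream_mode(
        StreamARN=summary["StreamDescriptionSummary"]["StreamARN"],
        StreamModeDetails={"StreamMode": "ON_DEMAND"},
    )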
2021-11-30 17:46:12 |
AWS |
AWS News Blog |
Introducing Amazon Redshift Serverless – Run Analytics At Any Scale Without Having to Manage Data Warehouse Infrastructure |
https://aws.amazon.com/blogs/aws/introducing-amazon-redshift-serverless-run-analytics-at-any-scale-without-having-to-manage-infrastructure/
|
Introducing Amazon Redshift Serverless: Run Analytics At Any Scale Without Having to Manage Data Warehouse Infrastructure. We're seeing the use of data analytics expanding among new audiences within organizations, for example with users like developers and line-of-business analysts who don't have the expertise or the time to manage a traditional data warehouse. Also, some customers have variable workloads with unpredictable spikes, and it can be very difficult for them … |
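Not from the post: once a serverless endpoint exists, querying it looks like querying any other Redshift endpoint. A minimal sketch with psycopg2; the host, database, credentials, and table name are all placeholders:

    import psycopg2

    # Endpoint host and credentials below are illustrative, not real.
    conn = psycopg2.connect(
        host="default.123456789012.us-east-1.redshift-serverless.amazonaws.com",
        port=5439,
        dbname="dev",
        user="admin",
        password="...",
    )
    with conn, conn.cursor() as cur:
        # demo_sales is a hypothetical table.
        cur.execute("SELECT region, SUM(amount) FROM demo_sales GROUP BY region LIMIT 10;")
        for row in cur.fetchall():
            print(row)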
2021-11-30 17:44:46 |
AWS |
AWS News Blog |
AWS Lake Formation – General Availability of Cell-Level Security and Governed Tables with Automatic Compaction |
https://aws.amazon.com/blogs/aws/aws-lake-formation-general-availability-of-cell-level-security-and-governed-tables-with-automatic-compaction/
|
AWS Lake Formation: General Availability of Cell-Level Security and Governed Tables with Automatic Compaction. A data lake can help you break down data silos and combine different types of analytics into a centralized repository. You can store all of your structured and unstructured data in this repository. However, setting up and managing data lakes involves a lot of manual, complicated, and time-consuming tasks. AWS Lake Formation makes it easy … |
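For a sense of what cell-level security looks like in practice (my sketch, not the post's), boto3 exposes a CreateDataCellsFilter call; the account ID, database, table, and column names below are hypothetical:

    import boto3

    lf = boto3.client("lakeformation", region_name="us-east-1")

    # A data-cells filter combines a row filter with a column list, so a principal
    # granted this filter sees only the matching rows and only the listed columns.
    lf.create_data_cells_filter(
        TableData={
            "TableCatalogId": "123456789012",  # hypothetical account ID
            "DatabaseName": "sales_db",        # hypothetical
            "TableName": "orders",             # hypothetical
            "Name": "eu-rows-no-pii",
            "RowFilter": {"FilterExpression": "region = 'EU'"},
            "ColumnNames": ["order_id", "region", "amount"],
        }
    )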
2021-11-30 17:38:21 |
AWS |
AWS Big Data Blog |
Announcing Amazon EMR Serverless (Preview): Run big data applications without managing servers |
https://aws.amazon.com/blogs/big-data/announcing-amazon-emr-serverless-preview-run-big-data-applications-without-managing-servers/
|
Announcing Amazon EMR Serverless (Preview): Run big data applications without managing servers. Today we're happy to announce Amazon EMR Serverless, a new option in Amazon EMR that makes it easy and cost-effective for data engineers and analysts to run petabyte-scale data analytics in the cloud. With EMR Serverless, you can run applications built using open-source frameworks such as Apache Spark, Hive, and Presto, without having to configure … |
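A rough boto3 sketch of the flow (mine, not AWS's); the release label, role ARN, and S3 path are placeholders, and the parameter names assume the emr-serverless API as published:

    import boto3

    emr = boto3.client("emr-serverless", region_name="us-east-1")

    # Create an application; no cluster sizing, only a framework and release.
    app = emr.create_application(
        name="spark-demo",
        releaseLabel="emr-6.5.0-preview",  # placeholder release label
        type="SPARK",
    )

    # Submit a job; capacity is provisioned and scaled by the service.
    emr.start_job_run(
        applicationId=app["applicationId"],
        executionRoleArn="arn:aws:iam::123456789012:role/EMRServerlessJobRole",  # placeholder
        jobDriver={
            "sparkSubmit": {
                "entryPoint": "s3://my-bucket/scripts/etl.py",  # placeholder
            }
        },
    )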
2021-11-30 17:44:26 |
AWS |
AWS Management Tools Blog |
Use AWS Flexible Licensing options to optimize cost |
https://aws.amazon.com/blogs/mt/use-aws-flexible-licensing-options-to-optimize-cost/
|
Use AWS Flexible Licensing options to optimize cost. License Flexibility: Many organizations have an existing investment in Microsoft licenses. This includes licenses for the Windows Server operating system and Microsoft SQL Server. Customers who have decided to migrate to AWS often want to leverage their existing investment in Microsoft licenses to reduce the costs associated with the move. However, many customers also need … |
2021-11-30 17:27:47 |
AWS |
AWS Networking and Content Delivery |
Building Multi-Region AWS Client VPN with Microsoft Active Directory and Amazon Route 53 |
https://aws.amazon.com/blogs/networking-and-content-delivery/building-multi-region-aws-client-vpn-with-microsoft-active-directory-and-amazon-route-53/
|
Building Multi-Region AWS Client VPN with Microsoft Active Directory and Amazon Route 53. Introduction: Organizations often require a secure connection between their users and resources on internal networks. For organizations with a global workforce, traditional virtual private network (VPN) solutions can be difficult to scale. Providing a single VPN endpoint creates a single point of failure: an outage would mean loss of connectivity to critical IT infrastructure. Authenticating … |
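The pattern the post describes, one Client VPN endpoint per Region behind a single DNS name, can be wired up with Route 53 latency-based routing. A hedged boto3 sketch; the hosted zone ID, domain, and endpoint DNS names are invented:

    import boto3

    r53 = boto3.client("route53")

    # Two latency records with the same name: Route 53 answers with the
    # VPN endpoint in the Region closest (by measured latency) to the user.
    for region, target in [
        ("us-east-1", "cvpn-endpoint-0abc.prod.clientvpn.us-east-1.amazonaws.com"),  # hypothetical
        ("eu-west-1", "cvpn-endpoint-0def.prod.clientvpn.eu-west-1.amazonaws.com"),  # hypothetical
    ]:
        r53.change_resource_record_sets(
            HostedZoneId="Z0123456789ABC",  # hypothetical zone
            ChangeBatch={
                "Changes": [{
                    "Action": "UPSERT",
                    "ResourceRecordSet": {
                        "Name": "vpn.example.com",
                        "Type": "CNAME",
                        "SetIdentifier": region,
                        "Region": region,
                        "TTL": 60,
                        "ResourceRecords": [{"Value": target}],
                    },
                }]
            },
        )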
2021-11-30 17:32:27 |
AWS |
AWS Government, Education, and Nonprofits Blog |
Announcing the winners of the 2021-2022 AWS Imagine Grant |
https://aws.amazon.com/blogs/publicsector/announcing-winners-of-2021-2022-aws-imagine-grant/
|
Announcing the winners of the 2021-2022 AWS Imagine Grant. This year's cohort of nonprofit winners of the Imagine Grant represents the biggest group of winners out of a record number of applications, each exhibiting a commitment to innovation and an insistence on the highest standards for the mission areas they serve. For the first time, this year's Imagine Grant offered two distinct award categories, inviting nonprofits of all sizes to put forward both advanced technical projects in the "Go Further, Faster" award category and foundational IT projects in the "Momentum to Modernize" award category. |
2021-11-30 17:41:13 |
AWS |
AWS Startups Blog |
Too Good To Go: Saving the Planet One Saved Meal at a Time |
https://aws.amazon.com/blogs/startups/too-good-to-go-saving-the-planet-one-saved-meal-at-a-time/
|
Too Good To Go: Saving the Planet One Saved Meal at a Time. Food waste is a global issue that stretches far beyond its impact on underfed populations: it's one of the primary drivers behind the climate crisis, representing … of all greenhouse gas emissions. Combating food waste reduces methane emissions in particular, while also preventing the waste of the labor and resources required to make the food. But there's room for hope, and Too Good To Go is helping make it happen with their free app dedicated to reducing food waste worldwide by connecting customers to restaurants and stores with surplus food. |
2021-11-30 17:37:07 |
Program |
New questions in [all tags] | teratail |
I want to display a dialog message with a 3D model in CorigEngine |
https://teratail.com/questions/371730?rss=all
|
I want to display a dialog message with a 3D model in CorigEngine. This is a question about CorigEngine, a paid Unity asset. |
2021-12-01 02:51:09 |
Program |
New questions in [all tags] | teratail |
A program (Java) that labels regions in an image with OpenCV3 and erases the largest-area region from the image |
https://teratail.com/questions/371729?rss=all
|
A program (Java) that labels regions in an image with OpenCV and erases the largest-area region. What I want to achieve: I am writing a program that uses OpenCV to label the regions in an image and erase (fill in) the one with the largest area. |
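The question asks about Java, but the same pipeline sketched in Python (my illustration; file names are hypothetical) shows the shape of one common answer: connected-component labeling, then zeroing out the label with the largest area:

    import cv2
    import numpy as np

    img = cv2.imread("input.png", cv2.IMREAD_GRAYSCALE)  # hypothetical file
    _, binary = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

    # Label connected components; stats[:, CC_STAT_AREA] holds each label's area.
    n, labels, stats, _ = cv2.connectedComponentsWithStats(binary)

    # Label 0 is the background, so search only labels 1..n-1.
    largest = 1 + int(np.argmax(stats[1:, cv2.CC_STAT_AREA]))

    binary[labels == largest] = 0  # erase (fill) the largest region
    cv2.imwrite("output.png", binary)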
2021-12-01 02:48:02 |
Program |
New questions in [all tags] | teratail |
Error installing SQLite |
https://teratail.com/questions/371728?rss=all
|
Error installing SQLite. Background: running the "rails s" command produced the error "Your version of SQLite3 (…) is too old. Active Record supports SQLite3 >= (…)". |
2021-12-01 02:39:47 |
Linux |
New posts tagged Ubuntu - Qiita |
Pitfalls I hit while reducing microphone noise on Ubuntu |
https://qiita.com/hisw/items/3d7c382fe03391a93fa7
|
The details are at the linked page, but after configuring things so that the echo-cancelled source applied to the connected microphone, "Built-in Audio Analog Stereo (echo cancelled)" appeared as expected, and I could confirm noise-cancelled microphone input in a Teams test call. As an aside: having gone this far, the noise is gone, but the sound feels a little muffled, so the audio quality itself may be limited by the device. |
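A related sketch (not from the post, which works through PulseAudio configuration directly): the echo-cancel module can also be loaded programmatically with the third-party pulsectl library. The source name below is hypothetical and must be replaced with one of your real sources:

    import pulsectl  # third-party: pip install pulsectl

    with pulsectl.Pulse("enable-echo-cancel") as pulse:
        # Load PulseAudio's echo-cancel module against a specific mic;
        # source_master must match a real source name on your system.
        pulse.module_load(
            "module-echo-cancel",
            "source_master=alsa_input.usb-0000.analog-stereo aec_method=webrtc",
        )
        # The new "...echo-cancel..." source should now be listed.
        for src in pulse.source_list():
            print(src.name)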
2021-12-01 02:01:15 |
Tech Blog |
Developers.IO |
[Breaking] EC2 C7g instances powered by the AWS Graviton3 processor announced! #reinvent |
https://dev.classmethod.jp/articles/reinvent2021-amazon-ec2-c7g-and-graviton3/
|
awsgraviton |
2021-11-30 17:37:31 |
Tech Blog |
Developers.IO |
[Update] IPv6-only resources can now connect to IPv4 resources using NAT64/DNS64! |
https://dev.classmethod.jp/articles/ipv6-subnet-ec2-nat64-dns64/
|
natdns |
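Hedged sketch of what enabling the NAT64/DNS64 update in the title looks like with boto3 (my own illustration; the resource IDs are made up, and the parameter names assume the EC2 API additions that shipped with this feature):

    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # Turn on DNS64 for an IPv6-only subnet, so its resolver synthesizes
    # AAAA records under 64:ff9b::/96 for IPv4-only destinations.
    ec2.modify_subnet_attribute(
        SubnetId="subnet-0123456789abcdef0",  # hypothetical
        EnableDns64={"Value": True},
    )

    # Route the well-known NAT64 prefix through a NAT gateway, which
    # performs the actual IPv6-to-IPv4 translation.
    ec2.create_route(
        RouteTableId="rtb-0123456789abcdef0",      # hypothetical
        DestinationIpv6CidrBlock="64:ff9b::/96",
        NatGatewayId="nat-0123456789abcdef0",      # hypothetical
    )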
2021-11-30 17:07:28 |
Overseas TECH |
Ars Technica |
Microsoft plans to integrate a “buy now, pay later” app into Edge |
https://arstechnica.com/?p=1816919
|
microsoft |
2021-11-30 17:36:57 |
Overseas TECH |
Ars Technica |
UK orders Facebook to sell Giphy, rejects Meta’s proposed merger conditions |
https://arstechnica.com/?p=1816906
|
problem |
2021-11-30 17:19:20 |
Overseas TECH |
Ars Technica |
Holmes recounts sexual, emotional abuse by Theranos exec Balwani |
https://arstechnica.com/?p=1816904
|
holmes |
2021-11-30 17:01:32 |
Overseas TECH |
MakeUseOf |
Working With Nodes in DaVinci Resolve: A Beginner's Guide |
https://www.makeuseof.com/nodes-davinci-resolve-beginners-guide-how-to-use/
|
davinci |
2021-11-30 17:30:11 |
Overseas TECH |
MakeUseOf |
Custodial vs. Non-Custodial Crypto Wallets: What's the Difference? |
https://www.makeuseof.com/custodial-vs-non-custodial-crypto-wallets/
|
wallet |
2021-11-30 17:15:12 |
Overseas TECH |
DEV Community |
Building a metrics dashboard with Superset and Cube |
https://dev.to/cubejs/building-a-metrics-dashboard-with-superset-and-cube-1hc0
|
Building a metrics dashboard with Superset and Cube. TL;DR: In this tutorial we'll learn how to build a metrics dashboard with Apache Superset, a modern, open-source data exploration and visualization platform. We'll also use Cube, an open-source metrics store, as the data source for Superset, which will enable our dashboards to load in under a second, quite the opposite of what you'd usually expect from a BI tool, right? Now we're all set. Let's see what's on the shelves of this metrics store. (Numeric literals throughout this post were lost in extraction and are marked with "…".)

What is Apache Superset? Apache Superset is a data exploration and visualization platform, or in layman's terms, a tool that you can use to build dashboards with charts for internal users. Born at a hackathon at Airbnb back in …, with more than … stars on GitHub now, it's a leading open-source business intelligence tool. Superset has connectors for numerous databases, from Amazon Athena to Databricks to Google BigQuery to Postgres. It provides a web-based SQL IDE and no-code tools for building charts and dashboards.

Running Superset. Now let's run Superset to explore these features. To keep things simple, we'll run a fully managed Superset in Preset Cloud, where you can use it forever for free on the Starter plan. (If you'd like to run Superset locally with Docker, please see these instructions.) First, please proceed to the sign-up page and fill in your details; note that Preset Cloud supports signing up with your Google account. Within a few seconds you will be taken to your account with a readily available workspace. Switching to that workspace will reveal a few example dashboards that you can review later. Now let's navigate to Data > Databases via the top menu and… oops! We need a metrics store to connect to. Let's see how Cube can help us build one.

What is Cube? Cube is an open-source metrics store with nearly … stars on GitHub to date. It serves as a single source of truth for all metrics and provides APIs for powering BI tools and building data apps. You can configure Cube to connect to any database, define your metrics via a declarative data schema, and instantly get an API that you can use with Superset or many other BI tools.

Running Cube. Similarly to Superset, let's run a fully managed Cube in Cube Cloud, which has a free plan as well. (If you'd like to run Cube locally with Docker, please see these instructions.) First, please proceed to the sign-up page and fill in your details; note that Cube Cloud supports signing up with your GitHub account. Within a few seconds you will be taken to your account, where you can create your first Cube deployment. Provide a name for your deployment, select a cloud provider, and pick a region. At the next step, choose "Create" to start a new Cube project from scratch. Then pick Postgres to proceed to the screen where you can enter the following credentials:

    Hostname: demo-db-examples.cube.dev
    Port:     …
    Database: ecom
    Username: cube
    Password: …

Cube will connect to a publicly available Postgres database that I've already set up. The last part of the configuration is the data schema, which declaratively describes the metrics we'll be putting on the dashboard. Actually, Cube can generate it for us: pick the top-level "public" database from the list. In a while, your Cube deployment will be up and running.

Defining metrics. Please navigate to the Schema tab. You will see files like LineItems.js, Orders.js, Users.js, etc. under the "schema" folder. Let's review LineItems.js, which defines the metrics within the LineItems cube. (This file is different from the one in Cube Cloud, but we'll take care of that later.)

    cube(`LineItems`, {
      sql: `SELECT * FROM public.line_items`,

      measures: {
        count: {
          type: `count`,
        },
        price: {
          sql: `price`,
          type: `sum`,
        },
        quantity: {
          sql: `quantity`,
          type: `sum`,
        },
        // A calculated measure that references other measures.
        // See: calculated measures.
        avgPrice: {
          sql: `${CUBE.price} / ${CUBE.quantity}`,
          type: `number`,
        },
        // A rolling-window measure. See: rolling window.
        revenue: {
          sql: `price`,
          type: `sum`,
          rollingWindow: {
            trailing: `unbounded`,
          },
        },
      },

      dimensions: {
        id: {
          sql: `id`,
          type: `number`,
          primaryKey: true,
        },
        createdAt: {
          sql: `created_at`,
          type: `time`,
        },
      },

      dataSource: `default`,
    });

Key learnings here: the cube is a logical entity that groups measures and dimensions together; via the sql statement, this cube is defined over the entire public.line_items table (actually, a cube can be defined over an arbitrary SQL statement that selects data); measures (quantitative data) are defined as aggregations, e.g., count, sum, etc., over columns in the dataset; dimensions (qualitative data) are defined over textual, numeric, or temporal columns in the dataset; and you can define complex measures and dimensions with custom SQL statements or references to other measures.

Development mode. Now let's update the schema file in Cube Cloud to match the contents above. First, click "Enter Development Mode" to unlock the schema files for editing. This essentially creates a fork of the Cube API that tracks your changes in the data schema. Navigate to LineItems.js and replace its contents with the code above, then save your changes by clicking "Save All" to apply them to the development version of your API. You can apply as many changes as you wish, but we're done for now. Click "Commit & Push" to merge your changes back to the main branch. On the Overview tab you will see your changes deployed, and you can explore the metrics on the Playground tab. Good, we've built a metrics store that we can connect to Superset. How? Please go back to the Overview tab and click "How to connect". The SQL API tab has a toggle that enables the API for Superset and other BI tools; turning it on will provide you with all the necessary credentials. Now let's build a dashboard.

Building a dashboard in Superset. We'll need to go through a few steps: connect Superset to Cube, define the datasets, and create charts and add them to a dashboard. Let's go!

Connect Superset to Cube. Switch back to the workspace we created earlier. Then navigate to Data > Databases via the top menu, click "+ Database", select MySQL, and fill in the credentials from your Cube Cloud instance, or use the credentials below:

    Host:          aquamarine-moth.sql.aws-us-east-….cubecloudapp.dev
    Port:          …
    Database name: db
    Username:      cube_aquamarine-moth
    Password:      …
    Display name:  Cube Cloud (it's important!)

You can press "Connect" now.

Define the datasets. Navigate to Data > Datasets via the top menu, click "+ Dataset", and fill in the following: Database: Cube Cloud (the one we've just created); Schema: db; See table schema: LineItems. You can press "Add" now. Then please repeat this for Users and Orders.

Create charts and a dashboard. We'll take a leap and create everything in a single step: in Superset, you can export a dashboard with all its charts as a JSON file and import it later. Navigate to Dashboards and click the link with an icon on the right. Download this file to your machine, select it, and click "Import". Whoa! Now we have a complete dashboard for Acme, Inc. Click on it to view it. Looks nice, doesn't it? Let's explore what's under the hood and how you can build it on your own.

Diving deep into Superset. Anatomy of a dashboard: you can see that charts on the dashboard are aligned to the grid. To rearrange them, click on the pencil icon in the top-right corner. You can add tabs, headers, dividers, Markdown blocks, etc. Of course, you can also add charts.

The simplest chart. Navigate to Charts via the top menu and click on any chart with the "Big Number" visualization type, e.g., "Customers". In my opinion, that's the simplest chart you can create in Superset: it contains a single metric, and I doubt a chart can get simpler than that. Let's dissect how a chart is defined. Visualization type: "Big Number"; that's where you can select or change the chart type. Time column: createdAt; interestingly enough, any chart should have this time column defined, even if the displayed data has no temporal components. Metric: COUNT(*); that's the most important part of any chart configuration. Upon clicking on this metric, you'll see that you can either select a saved definition, simply select a column and an aggregation, or write a custom SQL expression. When all config options are set, press "Run" to fetch the data, then "Save" to persist the chart or add it to a dashboard (no need to do it now, it's already added).

A less simple chart. Navigate to Charts via the top menu and click on any chart with the "Big Number with Trendline" visualization type, e.g., "Revenue". Still, it contains a single metric, as well as a sketchy chart at the bottom. Let's dissect how this chart is defined (only the new options): Time grain: Month, defining the temporal granularity for metric calculations. Time range: Last year, specifying the date range for this chart. Metric: revenue; it's interesting: click on this metric to learn that it's defined using custom SQL. That's because the aggregation has already been performed by Cube; no need to aggregate aggregated values, right?

Other charts. Actually, now you know everything you need to explore and dissect the other charts. Just keep in mind that Superset has plenty of customizations you can apply to charts; see the Customize tab for inspiration.

Viewing SQL. For any chart, you can reveal the SQL query to Cube, which is generated by Superset to fetch the data. Press the burger button with triple horizontal lines in the top-right corner, then "View query". Good ol' SQL, nice! Also, if you wanna use the aforementioned SQL IDE, navigate to SQL Lab > SQL Editor via the top menu.

Making business intelligence fast. There's only one thing left to explore, but it's a huge one. Let's navigate back to the Acme, Inc. dashboard. It takes … seconds to load, and the infinity-shaped spinners are clearly visible. They are not annoying, but honestly, wouldn't you like this dashboard to load instantly? Yep, well under a second. Cube provides an out-of-the-box caching layer that allows you to pre-compute and materialize the data required to serve the queries. All you need to do is define which queries should be accelerated; it's done declaratively, in the data schema files. (Please also note that Superset has its own lightweight caching layer, which might be handy in cases when you need to push your Cube + Superset setup to the limit.) Please go back to your Cube Cloud instance, enter development mode, switch to the Schema tab, and update your data schema files with small snippets as follows. First, LineItems.js should look like this:

    cube(`LineItems`, {
      sql: `SELECT * FROM public.line_items`,

      // Copy me ↓
      preAggregations: {
        main: {
          measures: [CUBE.count, CUBE.revenue, CUBE.price, CUBE.quantity],
          timeDimension: CUBE.createdAt,
          granularity: `day`,
        },
      },
      // Copy me ↑

      measures: {
        count: {
          type: `count`,
        },
        // ...
      },
      // ...
    });

Second, Orders.js should look like this:

    cube(`Orders`, {
      sql: `SELECT * FROM public.orders`,

      // Copy me ↓
      preAggregations: {
        main: {
          measures: [CUBE.count],
          dimensions: [CUBE.status],
          timeDimension: CUBE.createdAt,
          granularity: `day`,
        },
      },
      // Copy me ↑

      measures: {
        count: {
          type: `count`,
        },
        // ...
      },
      // ...
    });

Lastly, Users.js should look like this:

    cube(`Users`, {
      sql: `SELECT * FROM public.users`,

      // Copy me ↓
      preAggregations: {
        main: {
          measures: [CUBE.count],
          dimensions: [CUBE.city, CUBE.gender],
        },
      },
      // Copy me ↑

      measures: {
        count: {
          type: `count`,
        },
        // ...
      },
      // ...
    });

Don't forget to click "Save All", then "Commit & Push", and check that your changes were deployed on the Overview tab. In the background, Cube will build the necessary caches. It's time to get back to your dashboard and refresh it. Now refresh it one more time. See? The dashboard loads in under a second. Of course, you have plenty of options to fine-tune the caching behavior, e.g., specify the cache-rebuilding schedule.

Wrapping up. Thanks for following this tutorial. I encourage you to spend some time in the docs and explore the other features of Apache Superset. Also, please check out the Preset docs, which are packed with great content, e.g., on creating charts. And thanks for learning more about building a metrics store with Cube; indeed, it's a very convenient tool to serve as a single source of truth for all metrics. Please don't hesitate to like and bookmark this post, write a comment, and give a star to Cube and Superset on GitHub. I hope these tools will be a part of your toolkit when you decide to build a metrics store and a business intelligence application on top of it. Good luck, and have fun! |
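Superset talks to Cube's SQL API over the MySQL protocol, so you can smoke-test the same credentials from Python before wiring up Superset. A sketch (mine, not the post's) using the third-party mysql-connector-python package, with the placeholder credentials from above; the port and query are assumptions, and Cube's docs define the exact SQL dialect it accepts:

    import mysql.connector  # pip install mysql-connector-python

    conn = mysql.connector.connect(
        host="<your Cube Cloud SQL API host>",
        port=3306,  # assumption: the default MySQL port
        user="<your Cube Cloud user>",
        password="<your password>",
        database="db",
    )
    cur = conn.cursor()
    # Cubes are exposed as tables; this mainly verifies connectivity.
    cur.execute("SELECT COUNT(*) FROM LineItems")
    print(cur.fetchone())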
2021-11-30 17:46:32 |
Overseas TECH |
DEV Community |
How to Add Preloader in HTML Page |
https://dev.to/softcodeon/how-to-add-preloader-in-html-page-1obn
|
How to Add Preloader in HTML Page. A preloader is one of the important elements of a user-friendly interface. It indicates that content is still loading, with an animated icon or text. In this tutorial you will come to know how to create a preloader on an HTML page. Basically, it is a full-screen loading animation that covers the whole page until the page has fully loaded. The overlay and loader icon are built with CSS (no images), and jQuery is used for the preloader function. Well, let's get started with the HTML structure to build a loading screen.

How to Add Preloader in HTML Page. In order to display a loading-screen animation before the page completely loads, you need to create two main HTML elements. The first div element is the preloader that covers the whole page to hide the main content of the page. Similarly, the second div element is the container that holds the loader's related content. So, create a div element with the id "preloader". Likewise, create a div element with the class "container-preloader" and the id "container", place a child div inside it, and give it the class "animation-preloader". Inside the animation preloader, create a div element with the class "spinner". In the end, create spans with the preloader-text attribute, adding the letters one by one as you want them to load, like "SoftDev" or "Loading…", whatever you want to show on the loading screen. You just need to write your brand or company name, one letter per span tag, with the class "characters". Now all done; see the HTML code below (reconstructed from the escaped markup in the feed):

    <script src="…"></script>
    <!-- Preloader -->
    <div id="preloader">
      <div id="container" class="container-preloader">
        <div class="animation-preloader">
          <div class="spinner"></div>
          <div class="txt-loading">
            <span preloader-text="S" class="characters">S</span>
            <span preloader-text="O" class="characters">O</span>
            <span preloader-text="F" class="characters">F</span>
            <span preloader-text="T" class="characters">T</span>
            <span preloader-text="D" class="characters">D</span>
            <span preloader-text="E" class="characters">E</span>
            <span preloader-text="V" class="characters">V</span>
          </div>
        </div>
        <div class="loader-section section-left"></div>
        <div class="loader-section section-right"></div>
      </div>
    </div>
    <h1 class="text">We're Now <span class="open">OPEN</span></h1>

You can also add any other element, like your site logo, inside the "preloader" div that you want to display on the loading screen. Similarly, if you want to show only the animated loader icon, you can remove the complete div with the class "txt-loading".

CSS styles. After creating the HTML structure for the preloader, now it's time to style it using CSS. For this purpose, target the preloader element and make it a full-width element by setting the width and height to 100%. Likewise, define a value for the z-index and set a background color according to your needs. In my previous post, Create a Calendar in HTML and CSS, someone asked me to add comments, so I have added comments here to make it easy to learn. [The full CSS listing lost its numeric values in extraction. In outline, it resets box-sizing, margin, and padding; makes .container-preloader a fixed, full-screen, centered flex overlay with a high z-index; draws the .spinner as a bordered circle rotated by a "spinner" keyframe animation; fades each ".characters" letter in with a staggered "characters" keyframe via animation-delay on nth-child selectors; styles the left and right .loader-section curtain halves; defines the fade-out (.loaded) transition and curtain effect; and includes media queries for laptop, tablet, and phone sizes. See the linked post for the complete styles.]

At last, include the jQuery JavaScript library and the preloader function to fade the loader away after the window loads. You can set custom durations in milliseconds for the delay and fade-out animation (the literal values were lost in extraction):

    $(document).ready(function () {
      setTimeout(function () {
        // Once the container has finished, the scroll appears.
        $('#container').addClass('loaded');
        if ($('#container').hasClass('loaded')) {
          // Once the container is gone, the entire preloader section is deleted.
          $('#preloader').delay(/* ms */).queue(function () {
            $(this).remove();
          });
        }
      }, /* ms */);
    });

Want to read it in detail on our official website? See "How to Add Preloader in HTML Page"; a demo is linked there as well. That's all! Hopefully you have successfully implemented this preloader into your HTML web page. If you have any questions or suggestions, let me know in the discussion below. |
2021-11-30 17:27:59 |
Overseas TECH |
DEV Community |
Serverless e-commerce: Vendure on Google Cloud Run |
https://dev.to/martijnvdbrug/serverless-e-commerce-vendure-on-google-cloud-run-397d
|
Serverless e-commerce: Vendure on Google Cloud Run. Google's Cloud Run is a scalable, containerized, fully managed serverless platform. It's cheap, and it handles infrastructure and scaling for us. Sounds perfect to run a headless Vendure instance on, right? Here is what we need to do (don't worry, for most of these you can use a plugin): create a database; use Cloud Storage for assets; use Cloud Tasks to process worker jobs; Dockerize Vendure; build an image and deploy to Google Cloud Run. We assume you already have a Vendure project set up locally (if not, check out these steps) and the gcloud CLI installed and authorized. (Numeric values in the snippets below were lost in extraction and are marked with "…".)

Create a database for Vendure. Vendure requires a database to store its products, orders, customers, etc. In this tutorial we will use Google's Cloud SQL, a managed database platform. Enable Cloud SQL in your dashboard and create a database; for this example, … vCPU and … GB SSD is sufficient. PostgreSQL or MySQL is up to you. Make sure to write down the password you entered. Create a database named "vendure". Click on your instance and go to Connection > Add network. Create a new network with IP range 0.0.0.0/0. (Careful with production environments: this will make your SQL instance publicly available!) Configure Vendure to use the new database:

    // vendure-config.ts
    dbConnectionOptions: {
      type: 'mysql',
      synchronize: true, // Disable this after first startup
      logging: false,
      username: 'root',  // Don't use this in production
      password: '...',   // your password
      host: '...',       // the public IP of your SQL instance
      database: 'vendure',
    },

Setup Cloud Storage. Cloud Run instances are stateless, which also means you shouldn't use the local file system. Instead, we will use Google Cloud Storage. Create a storage bucket and make it publicly accessible, so that the images can be used in a storefront. You can use this plugin to connect Vendure to the bucket:

    // vendure-config.ts
    AssetServerPlugin.init({
      storageStrategyFactory: () => new GoogleStorageStrategy({
        bucketName: 'your-bucket-name',
      }),
      route: 'assets',
      assetUploadDir: '/tmp/vendure/assets',
    }),

Alternatively, you can implement the AssetStorageStrategy yourself.

Setup Cloud Tasks. Cloud Run allows no processing outside the request context: as soon as your server has returned a response, there is no guarantee that any leftover processes will be finished. Because of this, we need some way to wrap the Vendure worker jobs in a request. We can do that with this plugin. It puts jobs in a queue, and Cloud Tasks posts the messages back to a publicly available endpoint in your Vendure application:

    // vendure-config.ts
    CloudTasksPlugin.init({
      taskHandlerHost: '...', // This endpoint needs to be accessible by Google Cloud Tasks
      projectId: 'your-projectId',
      location: 'europe-west…',
      authSecret: 'some-secret',
    }),

For simplicity, we will start the worker in the same instance as the application:

    // index.ts
    bootstrap(config)
      .then(app => app.get(JobQueueService).start())
      .catch(err => {
        console.log(err);
        process.exit(1);
      });

This is not recommended for production; read more about the worker here. Alternatively, you could implement the JobQueueStrategy yourself.

Dockerize Vendure. Cloud Run requires Vendure to be Dockerized. We can simply create a Dockerfile in the root of the project:

    FROM node
    WORKDIR /usr/src/app
    COPY . .
    RUN yarn install --production
    RUN yarn build
    CMD ["node", "dist/index.js"]

Deploy. First we need to build an image and push it to Google's container registry. Install Docker, then execute these commands in the root of your project:

    docker build -t eu.gcr.io/<your-projectId>/vendure .
    # Configure Docker to use Google authentication
    gcloud auth configure-docker -q
    docker push eu.gcr.io/<your-projectId>/vendure

Now all that's left is deploying the image to Google Cloud Run. This sets all your secrets from a .env file in a variable, so we can pass them to Cloud Run:

    export ENV_VARS=$(paste -sd, .env)
    gcloud run deploy shops-test --quiet \
      --image eu.gcr.io/<your-projectId>/vendure:latest \
      --region europe-west… --platform managed --allow-unauthenticated \
      --memory …G --project <your-projectId> --set-env-vars "$ENV_VARS"

Go to the console to view your public URL. Make sure to also set this URL as taskHandlerHost in the Cloud Tasks plugin, if you haven't already. Go to the admin UI and log in with your superadmin user; go to Products and create a new product; add an image to the product; save the product and go back to the overview. Your product should appear in the product overview in the admin. Congratulations, this means everything works as expected! Make sure you set synchronize: false after the database has been populated. Some optional improvements: set up a separate worker instance; restrict access to your database to specific IPs; use Unix sockets for database access. That's it! |
2021-11-30 17:20:38 |
Overseas TECH |
DEV Community |
Solving Interview Problems with Deep Learning |
https://dev.to/mage_ai/solving-interview-problems-with-deep-learning-5cdg
|
Solving Interview Problems with Deep Learning. (Photo credit: Godzilla vs. Kong.) It's been a while since I've practiced programming interview questions, and I worry that my skills are lacking. It's always important to work through some every now and then to stay sharp, so here we go.

Let's start with Fizz Buzz: write a program that prints the numbers from 1 to 100, but for multiples of 3 print "Fizz" instead of the number, for multiples of 5 print "Buzz", and for numbers which are multiples of both 3 and 5 print "FizzBuzz". This solution was actually inspired by Joel Grus. We can represent numbers with binary encoding. (The numeric literals in the listings were lost in extraction; the obvious ones are restored below, and unrecoverable hyperparameter values are marked with "…".)

    import tensorflow as tf
    import numpy as np

    INPUT_SIZE = …   # we can encode up to 2**INPUT_SIZE numbers with binary encoding
    HIDDEN_SIZE = …  # hidden layer size
    OUTPUT_SIZE = 4  # one-hot encoding for the 4 possible outputs: number, Fizz, Buzz, FizzBuzz
    NUM_EPOCHS = …

    def binary_encode(number):
        data = np.zeros(INPUT_SIZE)
        for bitshift in range(INPUT_SIZE):
            data[bitshift] = (number >> bitshift) & 1  # get the bit at position bitshift
        return data

    def fizz_buzz_encode(i):
        if i % 15 == 0:
            return np.array([0, 0, 0, 1])
        elif i % 5 == 0:
            return np.array([0, 0, 1, 0])
        elif i % 3 == 0:
            return np.array([0, 1, 0, 0])
        else:
            return np.array([1, 0, 0, 0])

    def fizz_buzz_decode(i, encoding):
        max_idx = np.argmax(encoding)
        if max_idx == 0:
            return i
        if max_idx == 1:
            return 'fizz'
        if max_idx == 2:
            return 'buzz'
        if max_idx == 3:
            return 'fizzbuzz'

Now we can generate some training data. We will generate training data above 100, since our actual problem will solve FizzBuzz for 1 to 100:

    X_train = np.array([binary_encode(i) for i in range(101, 2 ** INPUT_SIZE)])
    y_train = np.array([fizz_buzz_encode(i) for i in range(101, 2 ** INPUT_SIZE)])

And now let's define the multilayer perceptron model that will actually perform the bulk of the work. First we define the inputs to the model:

    X = tf.placeholder("float", [None, INPUT_SIZE])
    Y = tf.placeholder("float", [None, OUTPUT_SIZE])

Next, let's define the weights and biases that we will learn via backpropagation:

    w1 = tf.get_variable("w1", [INPUT_SIZE, HIDDEN_SIZE], initializer=tf.random_normal_initializer())
    b1 = tf.get_variable("b1", [HIDDEN_SIZE])
    w2 = tf.get_variable("w2", [HIDDEN_SIZE, OUTPUT_SIZE], initializer=tf.random_normal_initializer())
    b2 = tf.get_variable("b2", [OUTPUT_SIZE])

And now let's feed our input through the model:

    z1 = tf.add(tf.matmul(X, w1), b1)
    a1 = tf.nn.relu(z1)
    z2 = tf.add(tf.matmul(a1, w2), b2)

Then let's compute the loss and tell TensorFlow to minimize it:

    cost = tf.nn.softmax_cross_entropy_with_logits(logits=z2, labels=Y)
    optimizer = tf.train.GradientDescentOptimizer(…).minimize(cost)

Now let's actually feed our real training data into that model:

    init = tf.global_variables_initializer()
    with tf.Session() as sess:
        sess.run(init)
        for i in range(NUM_EPOCHS):
            c, o = sess.run([cost, optimizer], feed_dict={X: X_train, Y: y_train})
            print(np.sum(c))
        X_test = np.array([binary_encode(i) for i in range(1, 101)])
        y_test = np.array([fizz_buzz_encode(i) for i in range(1, 101)])
        pred = sess.run(z2, feed_dict={X: X_test, Y: y_test})

And print out the results:

    def real_fizz_buzz(i):
        txt = ''
        if i % 3 == 0:
            txt += 'fizz'
        if i % 5 == 0:
            txt += 'buzz'
        return txt or i

    for i in range(len(X_test)):
        text = fizz_buzz_decode(i + 1, pred[i])
        true_text = real_fizz_buzz(i + 1)
        print('{} {} {}'.format(i + 1, text, '' if text == true_text else 'x'))

[The printed results table was garbled in extraction.] For all that work, we achieved an embarrassingly low accuracy. I was going to try to fix this, but then quickly lost interest. Let's move on to something a bit more interesting.

Write a program that counts the number of unique characters in a string. This sounds like a problem that can be solved with an LSTM. Let's generate some data:

    import random

    CHARS = ['a', 'b', 'c', 'd', 'e', 'f', 'g']
    STRING_LENGTH = …
    num_examples = …

    # Args: n, the number of examples to generate.
    # Returns:
    #   strings_v: (n, STRING_LENGTH, len(CHARS)) one-hot encoding of the text sequences
    #   strings:   the actual generated random text
    #   uniques_v: (n, len(CHARS)) one-hot encoding of the number of unique characters
    #   uniques:   the number of unique characters in each sequence
    def generate_data(n=num_examples):
        chars_to_idx = {c: i for i, c in enumerate(CHARS)}
        strings_v = np.zeros((n, STRING_LENGTH, len(CHARS)))
        strings = [''] * n
        uniques = np.zeros(n)
        uniques_v = np.zeros((n, len(CHARS)))
        for x in range(n):
            for y in range(STRING_LENGTH):
                random.shuffle(CHARS)
                char = CHARS[0]
                strings_v[x][y][chars_to_idx[char]] = 1
                strings[x] += char
            uniques[x] = len(set(strings[x]))
            uniques_v[x][len(set(strings[x])) - 1] = 1
        return strings_v, strings, uniques_v, uniques

Next, let's create our LSTM model:

    HIDDEN_LAYERS = …
    X = tf.placeholder("float", [None, STRING_LENGTH, len(CHARS)])
    y = tf.placeholder("float", [None, len(CHARS)])
    X_seq = tf.unstack(X, STRING_LENGTH, 1)  # sequence of one-hot chars
    lstm_cell = tf.contrib.rnn.BasicLSTMCell(HIDDEN_LAYERS)
    # sequence of chars to output
    outputs, states = tf.contrib.rnn.static_rnn(lstm_cell, X_seq, dtype=tf.float32)
    final_output = outputs[-1]
    weights = tf.get_variable("weights", [HIDDEN_LAYERS, len(CHARS)], initializer=tf.random_normal_initializer())
    biases = tf.get_variable("biases", [len(CHARS)], initializer=tf.random_normal_initializer())
    prediction = tf.add(tf.matmul(final_output, weights), biases)
    cost = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits=prediction, labels=y))
    optimizer = tf.train.AdamOptimizer(…)
    train_op = optimizer.minimize(cost)

Now let's go ahead and run our model. (The training loop mirrors the FizzBuzz one: run train_op and cost for EPOCHS iterations, then evaluate on freshly generated test data and print the first few examples along with the overall accuracy; the loop and its printed output were garbled in extraction.) Sample output, with the numeric values lost:

    string eeaccbdeagac pred … actual …
    string gefaddbbcfac pred … actual …
    …
    accuracy …

Hopefully the interviewer won't be disappointed that we cannot solve this problem for any string longer than MAX_LENGTH, but overall it achieved a … accuracy. I didn't deal with variable-length inputs; in this case, we could've easily done that by adding padding.

Now, instead of just printing out the number of unique characters, print out the actual unique characters in the order they appear. Ex: "acabdb" → "acbd". Oof, that's a much tougher problem. In this case, the value we need to return is a sequence of characters instead of a single character or number, so we will need some form of sequence-to-sequence model in order to learn this relationship.

Again, we can one-hot encode the sequences, but this time we will add some padding, since the length of the output string is variable. Instead of trying to return a variable-length sequence, we will just return a sequence that is equal to the length of the input and pad the output with spaces. For example, "abdab" would map to a string of the same length, but with spaces as padding: "abd  ".

    # "abc" → "abc", "aabbac" → "abc", "abacd" → "abcd"
    MAX_LENGTH = …             # max length of strings
    chars = ['a', 'b', 'c', 'd', 'e', 'f']
    all_chars = chars + [' ']  # space for padding
    NUM_EXAMPLES = …

[The data-generation listing follows the same pattern as generate_data above, sampling only from the valid characters and padding the solutions with spaces; it was garbled in extraction, as was the simple train/test split that follows.]

Now we can set up our model. Let's start with the encoder:

    encoded_input = tf.placeholder(tf.float32, shape=(None, MAX_LENGTH, len(all_chars)))
    decoded_input = tf.placeholder(tf.float32, shape=(None, MAX_LENGTH, len(all_chars)))
    with tf.name_scope("basic_rnn_seq2seq"):
        encoded_sequence = tf.unstack(encoded_input, MAX_LENGTH, 1)
        encoder_cell = rnn.BasicLSTMCell(…, forget_bias=1.0)
        encoded_outputs, states = rnn.static_rnn(encoder_cell, encoded_sequence, dtype=tf.float32)

And now the decoder:

    with tf.name_scope("lstm_decoder"):
        decoded_sequence = tf.unstack(decoded_input, MAX_LENGTH, 1)
        decoder_cell = rnn.BasicLSTMCell(…, reuse=True)
        decoded_outputs, _ = rnn.static_rnn(decoder_cell, decoded_sequence, initial_state=states, dtype=tf.float32)

Now we can compute the predictions by multiplying the decoder's hidden-layer outputs with a fully connected layer:

    with tf.name_scope("fully_connected"):
        weights = tf.get_variable("weights", […, len(all_chars)], initializer=tf.random_normal_initializer())
        biases = tf.get_variable("biases", [len(all_chars)], initializer=tf.random_normal_initializer())
        predictions = [tf.add(tf.matmul(output, weights), biases) for output in decoded_outputs]
        concatenated_outputs = tf.transpose(tf.stack(predictions), perm=[1, 0, 2])
        concatenated_inputs = tf.concat(decoded_input, 1)

Now we can compute the loss:

    cost = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(
        logits=concatenated_outputs, labels=concatenated_inputs))
    optimizer = tf.train.AdamOptimizer(…)
    train_op = optimizer.minimize(cost)

And now let's run our model. (The training and decoding loop again mirrors the earlier ones and was garbled in extraction.) Here are the results, with the numeric values lost:

    training cost … epoch …
    cebbdf → cebdf (predicted cebdf)
    dffcbc → dfcb (predicted dfcb)
    aadeab → adeb (predicted adeb)
    faceec → face (predicted face)
    bfaeec → bfaec (predicted bfaec)
    accuracy …

It looks like by … epochs our model has a fairly good understanding of how to solve this problem. Please do not solve real interview problems in this way. |
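Since the post's TF1 listings lost their numeric literals in extraction, here is a compact re-implementation of the second problem (counting unique characters) in modern tf.keras, as a sanity check that the task is learnable. This is my own sketch with my own hyperparameters, not the author's code:

    import numpy as np
    import tensorflow as tf

    CHARS = "abcdefg"
    LEN = 12  # assumed sequence length

    def make_data(n):
        # Random index sequences over the 7-letter alphabet.
        x = np.random.randint(0, len(CHARS), size=(n, LEN))
        X = np.eye(len(CHARS))[x].astype("float32")    # one-hot, shape (n, LEN, 7)
        y = np.array([len(set(row)) - 1 for row in x]) # class 0..6 = 1..7 unique chars
        return X, y

    model = tf.keras.Sequential([
        tf.keras.layers.LSTM(64, input_shape=(LEN, len(CHARS))),
        tf.keras.layers.Dense(len(CHARS), activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

    X, y = make_data(20000)
    model.fit(X, y, epochs=5, validation_split=0.1)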
2021-11-30 17:19:45 |
Overseas TECH |
DEV Community |
KAFKA + KSQLDB + .NET #1 |
https://dev.to/vaivoa/kafka-ksqldb-net-1-40g4
|
KAFKA + KSQLDB + .NET #1. Hi, I'm Ricardo Medeiros, .NET back-end developer @ vaivoa, and today I'm going to walk you through using ksqlDB to query messages produced in Kafka by a .NET (C#) producer. For this example, I will be deploying my environment as containers described in a docker-compose file, to ensure easy reproducibility of my results. The source code used in this example is available here.

Services. First, let's talk about the docker-compose environment services (the file is available here). Numeric values such as ports and image versions in the snippets below were lost in extraction and are marked with "…".

.NET API producer. An automatically generated .NET API with a docker-compose service:

    ksqldbdemo:
      container_name: ksqldbdemo
      image: ${DOCKER_REGISTRY-}ksqldbdemo
      build:
        context: .
        dockerfile: Dockerfile

This producer service needs the .NET-generated Dockerfile shown below:

    FROM mcr.microsoft.com/dotnet/aspnet:… AS base
    WORKDIR /app
    EXPOSE …
    EXPOSE …

    FROM mcr.microsoft.com/dotnet/sdk:… AS build
    WORKDIR /src
    COPY ["ksqlDBDemo.csproj", "./"]
    RUN dotnet restore "ksqlDBDemo.csproj"
    COPY . .
    WORKDIR /src
    RUN dotnet build "ksqlDBDemo.csproj" -c Release -o /app/build

    FROM build AS publish
    RUN dotnet publish "ksqlDBDemo.csproj" -c Release -o /app/publish

    FROM base AS final
    WORKDIR /app
    COPY --from=publish /app/publish .
    ENTRYPOINT ["dotnet", "ksqlDBDemo.dll"]

ZooKeeper. Despite no longer being strictly necessary in recent Kafka versions, ZooKeeper coordinates Kafka tasks, defining controllers, cluster membership, topic configuration, and more. In this tutorial the confluentinc ZooKeeper image is used, due to its use in the reference material. It makes Kafka more reliable, but adds complexity to the system.

    zookeeper:
      image: confluentinc/cp-zookeeper:…
      hostname: zookeeper
      container_name: zookeeper
      ports:
        - "…:…"
      environment:
        ZOOKEEPER_CLIENT_PORT: …
        ZOOKEEPER_TICK_TIME: …

Kafka. Kafka is an event-streaming platform capable of handling trillions of events a day. Kafka is based on the abstraction of a distributed commit log. Initially developed at LinkedIn to work as a message queue, it has evolved into a full-fledged event-streaming platform. Listed as "broker" in the services, it's the core of this tutorial. Its configuration is tricky, but the following worked well in this scenario:

    broker:
      image: confluentinc/cp-kafka:…
      hostname: broker
      container_name: broker
      depends_on:
        - zookeeper
      ports:
        - "…:…"
      environment:
        KAFKA_BROKER_ID: …
        KAFKA_ZOOKEEPER_CONNECT: zookeeper:…
        KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
        KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://broker:…,PLAINTEXT_HOST://localhost:…
        KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: …
        KAFKA_GROUP_INITIAL_REBALANCE_DELAY_MS: …
        KAFKA_TRANSACTION_STATE_LOG_MIN_ISR: …
        KAFKA_TRANSACTION_STATE_LOG_REPLICATION_FACTOR: …

ksqlDB. ksqlDB is a database built to allow distributed stream-processing applications. Made to work seamlessly with Kafka, it has a server that runs outside of Kafka, with a REST API, and a CLI application that can be run separately and is used in this tutorial.

ksqlDB Server. In this example the confluentinc image of the ksqlDB server is used, once more due to its widespread usage:

    ksqldb-server:
      image: confluentinc/ksqldb-server:…
      hostname: ksqldb-server
      container_name: ksqldb-server
      depends_on:
        - broker
      ports:
        - "…:…"
      environment:
        KSQL_LISTENERS: http://…
        KSQL_BOOTSTRAP_SERVERS: broker:…
        KSQL_KSQL_LOGGING_PROCESSING_STREAM_AUTO_CREATE: "true"
        KSQL_KSQL_LOGGING_PROCESSING_TOPIC_AUTO_CREATE: "true"

ksqlDB CLI. The same goes for the ksqlDB CLI service, which also uses the confluentinc image:

    ksqldb-cli:
      image: confluentinc/ksqldb-cli:…
      container_name: ksqldb-cli
      depends_on:
        - broker
        - ksqldb-server
      entrypoint: /bin/sh
      tty: true

Kafdrop. Kafdrop is a web UI for viewing Kafka topics and browsing consumer groups. It makes Kafka more accessible.

    kafdrop:
      container_name: kafdrop
      image: obsidiandynamics/kafdrop:latest
      depends_on:
        - broker
      ports:
        - "…:…"
      environment:
        KAFKA_BROKERCONNECT: broker:…

Tutorial. Now it's the time that you have been waiting for: let's make it work!

Environment. For this tutorial you'll need a Docker Desktop installation, either on a Linux distribution or on Windows with WSL2, and git.

Cloning the project. A Visual Studio project is available here; it has Docker support and already deploys all the services needed for this demo in the IDE. However, you will be fine if you don't want to, or can't, use Visual Studio. Just clone it by running the following command in the terminal and directory of your preference (the repository URL is linked in the original post):

    git clone <repository URL>

Use the following command to move into the project folder:

    cd ksqlDBDemo

And in the project folder that contains docker-compose.yml, run the following command to deploy the services:

    docker-compose up -d

After this command, make sure that all services are running. Sometimes services fall over, but that is okay. To see whether everything is running, look at the services in Docker Desktop, or execute:

    docker ps

which should list the ksqldbdemo, ksqldb-cli, kafdrop, ksqldb-server, broker, and zookeeper containers, all with status "Up". [The verbatim table was garbled in extraction.]

WEB API. Now, with all services up and running, we can access the Web API's Swagger to populate our Kafka topics. The code is very simple, and it's available in the repository. The Web API Swagger is deployed at http://localhost:…/swagger/index.html. It has two endpoints, and they create events that could come from independent microservices: one creates an event that registers a user name in the system, and another takes an Id and generates a three-digit code. You can create a user with the user name of your choice, and it will receive an assigned unique Id. Then you can request a three-digit code for your user's Id, and a random code is generated for the selected Id.

Kafdrop. We can use the Kafdrop UI to check that everything is okay. Kafdrop is deployed at http://localhost:…. There you will find all the brokers and topics available.

KSQL CLI. After all that, you'll be able to create your streams of data and query them using ksqlDB. In your preferred terminal, use the command:

    docker exec -it ksqldb-cli ksql http://ksqldb-server:…

Creating streams. Then you are in the ksql CLI and free to create your streams and queries. First, let's create a stream for each of our topics:

    CREATE STREAM stream_user (Name VARCHAR, Id VARCHAR)
      WITH (kafka_topic='demo_user', value_format='json', partitions=…);

    CREATE STREAM stream_code (Id VARCHAR, code INT)
      WITH (kafka_topic='demo_code', value_format='json', partitions=…);

Create a materialized view. You can join the client data with the most recent randomized code. To achieve this, you must create a materialized view (table) that joins both streams, as in the ksqlDB script that follows:

    CREATE TABLE currentCodeView AS
      SELECT user.Name,
             LATEST_BY_OFFSET(code.code) AS CurrentCode
      FROM stream_code code
        INNER JOIN stream_user user WITHIN … DAYS ON code.Id = user.Id
      GROUP BY user.Name
      EMIT CHANGES;

Making a push query. After that, we can query this materialized view:

    SELECT * FROM currentCodeView EMIT CHANGES;

This push query keeps running until you hit Ctrl+C to cancel it.

Conclusions. This tutorial demonstrates that in a Kafka + ksqlDB environment you can run SQL queries and also join data that comes from different events, which is one of the biggest complexities involved in microservice systems, and it is what ksqlDB solves by enabling SQL operations over Kafka topics. It's my goal to explore the possibilities this ecosystem allows, and I hope to bring more knowledge on this topic in other articles here. Any suggestions, comments, or corrections: feel free to reach me on LinkedIn.

Ricardo Medeiros: messaging, microservices, Kafka, ksqlDB, and dotnet explorer. Middle back-end developer @ vaivoa.

References: ksqlDB Quickstart; ksqlDB Overview; Kafka .NET Client; ksqlDB Documentation: Data Types Overview; KSQL and ksqlDB; Welcome to Apache ZooKeeper; What is ZooKeeper & How Does it Support Kafka?; What is Apache Kafka?; ksqlDB: The database purpose-built for stream processing applications; An overview of ksqlDB; CREATE TABLE AS SELECT; How to join a stream and a stream; Time and Windows in ksqlDB Queries.

Disclaimer: VaiVoa encourages its developers in their process of growth and technical acceleration. Published articles do not reflect VaiVoa's opinion. This publication serves the purpose of stimulating debate. |
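The demo's producer is C#/.NET, but the event payloads are plain JSON, so a quick way to feed the same topics from elsewhere is Python's confluent-kafka client. My sketch, not the article's code: the topic names follow the streams defined above, and the broker address assumes a localhost port mapping in the compose file:

    import json
    from confluent_kafka import Producer  # pip install confluent-kafka

    p = Producer({"bootstrap.servers": "localhost:9092"})  # assumed host mapping

    # Matches the stream_user schema: Name VARCHAR, Id VARCHAR.
    p.produce("demo_user", value=json.dumps({"Name": "ricardo", "Id": "1"}))
    # Matches the stream_code schema: Id VARCHAR, code INT.
    p.produce("demo_code", value=json.dumps({"Id": "1", "code": 123}))
    p.flush()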
2021-11-30 17:19:05 |
Overseas TECH |
DEV Community |
TypeScript: Any vs Unknown |
https://dev.to/luisvonmuller/typescript-any-vs-unknown-4gk9
|
|
2021-11-30 17:08:00 |
Apple |
AppleInsider - Frontpage News |
Union's 30th anniversary celebrated with new Union Beats Studio Buds |
https://appleinsider.com/articles/21/11/30/unions-30th-anniversary-celebrated-with-new-union-beats-studio-buds?utm_medium=rss
|
Union's 30th anniversary celebrated with new Union Beats Studio Buds. Beats by Dre is releasing a special edition of its Beats Studio Buds earphones, collaborating with Union to commemorate the clothing store's 30th anniversary. Available only through Union's Los Angeles and Tokyo stores and the retailer's website starting in December, the new option provides a different color scheme than the usual Beats offering. In this instance, the earbuds combine red, black, and green from the Pan-African flag. While the earbuds feature the Beats logo, the charging case uses the same color scheme, but with the Union logo emblazoned on the front. While the colors are new, the price is the usual one. Read more… |
2021-11-30 17:22:27 |
Apple |
AppleInsider - Frontpage News |
Apple to overtake Samsung in Q4 2021, 5G iPhone SE coming in 2022 |
https://appleinsider.com/articles/21/11/30/apple-to-overtake-samsung-in-q4-2021-5g-iphone-se-coming-in-2022?utm_medium=rss
|
Apple to overtake Samsung in Q4 2021, 5G iPhone SE coming in 2022. Apple has regained its spot as the second-largest smartphone manufacturer in the world in the third quarter of 2021, and the iPhone 13 models will likely drive it to first in Q4. The Cupertino tech giant had a … share of the market in the third quarter, which spans the period between July and September, according to analysis firm TrendForce, which ranked vendors on smartphone production, not sales. Quarter over quarter, Apple's four iPhone 13 and iPhone 13 Pro models drove a … increase in iPhone production in Q3. The company produced an estimated … million iPhone models during the period. Read more… |
2021-11-30 17:03:24 |
Overseas TECH |
Engadget |
Russia may press criminal charges in 2018 ISS pressure leak incident |
https://www.engadget.com/roscosmos-2018-iss-report-174303244.html?src=rss
|
Russia may press criminal charges in 2018 ISS pressure leak incident. In 2018, astronauts aboard the International Space Station plugged a 2mm "hole" in a Soyuz MS-09 vehicle that had docked with the station in June of that year. While the pressure leak never posed an immediate threat to those aboard, it set off a bizarre turn of events that saw Russia open an investigation to find out if the incident was the result of sabotage. Per an RIA Novosti article spotted by Ars Technica, the country's Roscosmos space agency recently completed its probe of the event and sent the results to Russian law enforcement officials, opening the door for them to announce criminal charges. Roscosmos didn't say anything official about the cause of the pressure leak, but that hasn't stopped Russian media from spreading misinformation. The RIA Novosti article references Russian media reports alleging the hole may have been drilled by NASA astronaut Serena Auñón-Chancellor, a crew member of the ISS at the time of the incident. Specifically, per Russia's TASS news agency, the country's Izvestia newspaper claimed Auñón-Chancellor may have drilled the hole out of a "desire to return to Earth because of a blood clot or a fight" with her onboard the International Space Station. Citing its own source, TASS claims "the hole had been drilled in weightlessness by a person not acquainted with the spaceship's design." According to NASA, the possibility that its astronauts were involved in creating the pressure leak is non-existent. As Ars Technica notes, NASA knew the location of all of its astronauts before the leak started and the moment it began. None of the US astronauts aboard the ISS at the time of the incident were near the Russian compartment where the Soyuz was docked when it started leaking air. The US shared this information with Russia when Roscosmos began its investigation. "These attacks are false and lack any credibility," NASA Administrator Bill Nelson told the outlet. "I fully support Serena and stand behind all of our astronauts." We've reached out to NASA for additional information. The accusations come at a time when the relationship between NASA and Roscosmos is already fraught. On November 15th, Russia conducted an anti-satellite missile test that created a debris field, forcing astronauts on the ISS to seek shelter aboard their spacecraft. The US condemned the trial, accusing the country of putting everyone aboard the ISS, including Russian cosmonauts, in danger. |
2021-11-30 17:43:03 |
Overseas Tech |
Engadget |
Bethesda shows off more 'Starfield' in a seven-minute featurette |
https://www.engadget.com/starfield-featurette-bethesda-todd-howard-171417817.html?src=rss
|
Bethesda shows off more 'Starfield' in a seven-minute featurette. Starfield is just under a year away from landing on PC and Xbox Series X/S, and Bethesda has offered another peek at what's in store with a mini-documentary. The seven-minute "Into the Starfield: The Endless Pursuit" featurette shows a lot of concept art and brief shots of things like robots, alien worlds, and a spaceport. The video is centered around the evolution of Bethesda Game Studios and the worlds it has built over the years. Given that many of the studio's games are about exploration, such as those in the Elder Scrolls and Fallout series, progressing to space exploration with Starfield is a logical next step. Art director Matt Carofano noted the upcoming game has a "more realistic, science-based backing to it" than, say, the fantasy world of Skyrim. Game director Todd Howard also offered a "cryptic" tease: he said Starfield has "two step-out moments." Many other games typically only have one of those, in which the player sees the expanse of an open-world environment for the first time. There isn't a ton of detail about what Starfield is in this video, but it gives folks who are excited about the game a little more insight. There will be more episodes of "Into the Starfield" in the coming months as the release date edges closer. Starfield will arrive on November 11th, 2022. |
2021-11-30 17:14:17 |
Overseas Tech |
Engadget |
Jack Dorsey took on Twitter’s biggest problems, but leaves plenty of challenges for his successor |
https://www.engadget.com/jack-dorsey-leaves-twitter-parag-agrawal-170029144.html?src=rss
|
Jack Dorsey took on Twitter's biggest problems, but leaves plenty of challenges for his successor. After a six-year stint as CEO (again), Jack Dorsey is leaving Twitter in a very different place than when he took it over in 2015. Back then, not everyone was excited about the return of the company's cofounder. Even though he initially came back temporarily, employees and investors were concerned that dual CEO roles (he was, and still is, the CEO of Square) would keep him from being able to tackle the company's many problems. "The general feeling among Twitter employees now is trepidation," The New York Times wrote in 2015 of Dorsey's surprise return. "Many are concerned at the prospect of Mr. Dorsey's interim title becoming permanent, given his divisive and sometimes erratic management style and the fact that he had been dismissed and returned to the company before." At the time, the company was often described as being "in turmoil." Twitter was churning through executives, and investors were concerned about lackluster user growth. Journalists and other pundits often noted that Twitter never knew how to explain what it was or why it mattered. The actual service had barely changed in years. Harassment was rampant and relatively unchecked. Much has changed since then. Hand-wringing over Dorsey's two jobs never really abated, but turnover at the top of the company eventually slowed, and Twitter started growing again. The platform still struggles with harassment, but has made a concerted effort at encouraging "healthy conversations" and has significantly ramped up its policies against hate speech and harassment. More recently, the company has undertaken a number of ambitious initiatives to change its core features and create new sources of revenue. In the last year alone, Twitter has introduced new features for live audio, groups, and payments. It rolled out creator-focused features like Super Follows and acquired a newsletter platform for longform content. Last month, it introduced Twitter Blue, a subscription service aimed at power users. The company is also in the early stages of Bluesky, a plan to create a decentralized standard for social media platforms. But incoming CEO Parag Agrawal will still be inheriting significant challenges alongside all the shiny new projects. Though the company has made strides in increasing conversational "health," it has also grappled with where to draw the line between free speech and toxicity, particularly when political figures are involved. And like other platforms, the company struggled to rein in misinformation during the COVID-19 pandemic and the 2020 presidential election. "Dorsey leaves behind a mixed legacy: a platform that's useful and potent for quick communication, but one that's been exploited by a range of bad actors, including former President Donald Trump, who did his best on Twitter to undermine democracy, until Dorsey's people finally had enough and shut him down," says Paul Barrett, deputy director of the NYU Stern Center for Business and Human Rights, who has researched social media polarization. That Twitter under Dorsey did eventually permanently ban Trump has only made the company more of a target for politicians. And that's unlikely to change just because Twitter's new CEO has been one of the company's lowest-profile executives. Agrawal is taking over as social media platforms face a bigger reckoning about their role in society. As lawmakers eye regulating algorithms and other reforms, Twitter has started to research algorithmic amplification and potential "unintentional harms" caused by its ranking systems. It will now be up to the company's former CTO to steer that work while navigating scrutiny from lawmakers. Agrawal will also inherit the ambitious goals Twitter set earlier this year: to double its revenue and grow its user base to 315 million monetizable daily active users (mDAU) by the end of 2023 (the company reported 211 million mDAU in its most recent earnings report). And there are some signs he may be well positioned to make that happen. While Twitter under Dorsey has been slow to make decisions and release updates, Agrawal has been a proponent of new features like Bitcoin tipping. He also oversaw Bluesky, the decentralization project. The company has been betting that moving away from advertising and leaning into subscription services and other new features will help it get there. But Twitter is hardly alone in pursuing creators and subscriptions, and it's not clear the company will be able to easily persuade large swaths of users to start paying for extra content or premium features. Twitter's new CEO seems to be well aware of the challenges ahead. "We recently updated our strategy to hit ambitious goals, and I believe that strategy to be bold and right," Agrawal wrote in an email to employees he shared on Twitter. "But our critical challenge is how we work to execute against it and deliver results." |
2021-11-30 17:00:29 |
Overseas Science |
NYT > Science |
What We Know About the New Covid Variant, Omicron |
https://www.nytimes.com/article/omicron-coronavirus-variant.html
|
What We Know About the New Covid Variant, Omicron. Intense research into the new coronavirus variant, first identified in southern Africa, has just begun. World leaders have urged people not to panic, and to get vaccinated if they can. |
2021-11-30 17:08:16 |
Overseas Science |
NYT > Science |
Counterfeit Covid Masks Are Still Sold Everywhere |
https://www.nytimes.com/2021/11/30/health/covid-masks-counterfeit-fake.html
|
Counterfeit Covid Masks Are Still Sold Everywhere. Rising Covid cases have spurred a return to mask-wearing in the U.S. and overseas, at a time when flawed KN95s from China continue to dominate e-commerce sites. |
2021-11-30 17:32:47 |
Overseas Science |
NYT > Science |
Omicron Has Scary Mutations. That Doesn't Mean They Work Well Together |
https://www.nytimes.com/2021/11/29/health/omicron-covid-mutation-epistasis.html
|
Omicron Has Scary Mutations. That Doesn't Mean They Work Well Together. Mutations can work together to make a virus more fearsome, but they can also cancel one another out. This phenomenon, called epistasis, is why scientists are reluctant to speculate on Omicron. |
2021-11-30 17:35:09 |
Overseas Science |
NYT > Science |
Will the Covid Vaccines Stop Omicron? Scientists Are Racing to Find Out. |
https://www.nytimes.com/2021/11/28/health/covid-omicron-vaccines-immunity.html
|
Will the Covid Vaccines Stop Omicron? Scientists Are Racing to Find Out. A “Frankenstein” mix of mutations raises concerns, but the variant may remain vulnerable to current vaccines. If not, revisions will be necessary. |
2021-11-30 17:33:26 |
Overseas Science |
NYT > Science |
How Did the New Covid Variant, Omicron, Get Its Name? |
https://www.nytimes.com/2021/11/27/world/africa/omicron-covid-greek-alphabet.html
|
confusion |
2021-11-30 17:34:07 |
Overseas Science |
NYT > Science |
New 'Omicron' Variant Stokes Concern but Vaccines May Still Work |
https://www.nytimes.com/2021/11/26/health/omicron-variant-vaccines.html
|
New 'Omicron' Variant Stokes Concern, but Vaccines May Still Work. The Omicron variant carries worrisome mutations that may let it evade antibodies, scientists said. But it will take more research to know how it fares against vaccinated people. |
2021-11-30 17:36:09 |
Finance |
News - Hoken Ichiba TIMES |
Sompo Japan and partners launch "Tohoku Electric Power Frontier Simple Living Insurance (Rental Type)" |
https://www.hokende.com/news/blog/entry/2021/12/01/030000
|
Sompo Japan and partners launch "Tohoku Electric Power Frontier Simple Living Insurance (Rental Type)," a subscription-style insurance product. Tohoku Electric Power Frontier Co., Ltd. (Tohoku Electric Power Frontier), Sompo Japan Insurance Inc. (Sompo Japan), and Mysurance Co., Ltd. (Mysurance), a Sompo Japan subsidiary that provides small-amount, short-term insurance, have begun offering "Tohoku Electric Power Frontier Simple Living Insurance (Rental Type)." |
2021-12-01 03:00:00 |
News |
BBC News - Home |
Covid: Booster offer for all adults in England by end of January |
https://www.bbc.co.uk/news/uk-59481992?at_medium=RSS&at_campaign=KARANGA
|
omicron |
2021-11-30 17:39:53 |
News |
BBC News - Home |
Covid in Scotland: All nine Omicron cases linked to single event |
https://www.bbc.co.uk/news/uk-scotland-59473564?at_medium=RSS&at_campaign=KARANGA
|
november |
2021-11-30 17:27:09 |
News |
BBC News - Home |
Arthur Labinjo-Hughes 'knew his dad was going to kill him' |
https://www.bbc.co.uk/news/uk-england-birmingham-59475076?at_medium=RSS&at_campaign=KARANGA
|
arthur |
2021-11-30 17:20:18 |
News |
BBC News - Home |
Covid: Greece to fine over-60s who refuse Covid-19 vaccine |
https://www.bbc.co.uk/news/world-europe-59474808?at_medium=RSS&at_campaign=KARANGA
|
january |
2021-11-30 17:19:35 |
News |
BBC News - Home |
Ray Kennedy: Former Liverpool and Arsenal midfielder dies aged 70 |
https://www.bbc.co.uk/sport/football/59480532?at_medium=RSS&at_campaign=KARANGA
|
kennedy |
2021-11-30 17:37:31 |
News |
BBC News - Home |
Favourite Trump suffers surprise defeat against Selt at UK Championship |
https://www.bbc.co.uk/sport/snooker/59481650?at_medium=RSS&at_campaign=KARANGA
|
championship |
2021-11-30 17:36:02 |
News |
BBC News - Home |
Omicron: What Covid rules are being toughened in the UK? |
https://www.bbc.co.uk/news/explainers-52530518?at_medium=RSS&at_campaign=KARANGA
|
omicron |
2021-11-30 17:11:00 |
Business |
Diamond Online - New Articles |
How to make enjoyable things last - Psychiatrist Tomy's Words That Blow Your Worries Away in One Second |
https://diamond.jp/articles/-/270700
|
voicy |
2021-12-01 02:50:00 |
Business |
Diamond Online - New Articles |
Why human "solitude," more than AI, is the ultimate strength - Start with Solitude |
https://diamond.jp/articles/-/288589
|
自分 |
2021-12-01 02:45:00 |
Business |
Diamond Online - New Articles |
What Ai Tominaga eats, more than raw vegetables, to boost her skin's moisture - Ai Tominaga: Meals That Create Beauty |
https://diamond.jp/articles/-/288799
|
What Ai Tominaga eats, more than raw vegetables, to boost her skin's moisture. From "Ai Tominaga: Meals That Create Beauty": Ai Tominaga has been active as a top model for more than … years. |
2021-12-01 02:40:00 |
Business |
Diamond Online - New Articles |
Life's two biggest sources of income: one is the "retirement lump sum"; what's the other? - You'll Lose Big If You Don't Know! The Right Answers on Money Before and After Retirement |
https://diamond.jp/articles/-/288912
|
Life's two biggest sources of income: the "retirement lump sum" and what else? From "You'll Lose Big If You Don't Know! The Right Answers on Money Before and After Retirement": How long will you work at this company? How will you take your retirement lump sum? After retirement, will you keep working as a company employee or go independent? At what age will you start drawing your pension? What will you do about housing? As retirement comes into view, the decisions you have to make for yourself keep multiplying. |
2021-12-01 02:35:00 |
Business |
Diamond Online - New Articles |
[Never just clam up] Try this approach when you hit a wall in a strategy consulting interview! - Strategy Consulting Firm Interview Exams, New Edition |
https://diamond.jp/articles/-/288973
|
面接対策 |
2021-12-01 02:30:00 |
Business |
Diamond Online - New Articles |
Improperly collecting pension benefits!? What to do right away when someone close to you dies - Frank Talk on Inheritance: The Complete Guide to Procedures |
https://diamond.jp/articles/-/289130
|
someone close |
2021-12-01 02:25:00 |
Business |
Diamond Online - New Articles |
How do skilled interviewers take notes during an interview? - The Listening Skills of the Interviewer People Line Up For |
https://diamond.jp/articles/-/288893
|
queues |
2021-12-01 02:20:00 |
Business |
Diamond Online - New Articles |
"The Highest-Quality Meeting Techniques" [Preview] - Preview edition of book content for Diamond Premium members |
https://diamond.jp/articles/-/279057
|
|
2021-12-01 02:15:00 |
Business |
Diamond Online - New Articles |
SoftBank invests in Zepeto, South Korea's metaverse platform - From the WSJ |
https://diamond.jp/articles/-/289180
|
South Korea |
2021-12-01 02:14:00 |
Business |
Diamond Online - New Articles |
Promoting the appeal of Saga specialties with local dialect: rolling out online sales with a live feel - Shinkin Management Information: Our Top Pick! |
https://diamond.jp/articles/-/289092
|
Promoting the appeal of Saga specialties with local dialect: rolling out online sales with a live feel. From Shinkin Management Information, Our Top Pick!: The online shop's lineup includes Ariake Sea nori, a Saga specialty said to be Japan's best in both production volume and quality, original noodle products kneaded with that nori, and traditional crafts such as Arita ware, among many other items. |
2021-12-01 02:10:00 |
Business |
Diamond Online - New Articles |
Solving healthcare's social challenges with a new framework for delivering high-end medical services - Shinkin Management Information: Top Interview |
https://diamond.jp/articles/-/289086
|
management information |
2021-12-01 02:05:00 |
Hokkaido |
Hokkaido Shimbun |
Arrests expected over suspected ¥650 million fraud; possible "jimenshi" land scammers in Shibuya land deal |
https://www.hokkaido-np.co.jp/article/617573/
|
real estate company |
2021-12-01 02:08:00 |