IT |
ITmedia All Articles |
[ITmedia Mobile] How effective are the custom processor and high-megapixel camera? An all-out review of the "Pixel 6 Pro" (Part 1) |
https://www.itmedia.co.jp/mobile/articles/2110/26/news076.html
|
google |
2021-10-26 02:30:00 |
AWS |
AWS Partner Network (APN) Blog |
Seamless Transition from an AWS Landing Zone to AWS Control Tower |
https://aws.amazon.com/blogs/apn/seamless-transition-from-an-aws-landing-zone-to-aws-control-tower/
|
Seamless Transition from an AWS Landing Zone to AWS Control Tower. A well-architected, multi-account AWS environment helps businesses use AWS to migrate, modernize, and innovate faster. Many customers currently using the self-managed AWS Landing Zone solution are looking to transition to AWS Control Tower to gain additional benefits. This post describes a strategic collaboration between Tech Mahindra, AWS, and a customer in Africa to transition from an AWS Landing Zone to the AWS Control Tower environment. |
2021-10-25 17:36:39 |
AWS |
AWS Compute Blog |
Building a difference checker with Amazon S3 and AWS Lambda |
https://aws.amazon.com/blogs/compute/building-a-difference-checker-with-amazon-s3-and-aws-lambda/
|
Building a difference checker with Amazon S3 and AWS Lambda. This blog post shows how to create a scalable difference-checking tool for objects stored in S3 buckets. The Lambda function is invoked when S3 writes new versions of an object to the bucket. This example also shows how to remove earlier versions of an object and define a set number of versions to retain. |
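A hedged sketch of the version-retention idea described above, assuming the AWS SDK for JavaScript v3; the post's actual implementation may differ, and the KEEP constant and function name are illustrative:

    import {
      S3Client,
      ListObjectVersionsCommand,
      DeleteObjectsCommand,
    } from "@aws-sdk/client-s3";

    const s3 = new S3Client({});
    const KEEP = 3; // number of versions to retain (illustrative)

    // Trim old versions of one object in a versioned bucket, keeping the newest KEEP.
    export async function trimVersions(bucket: string, key: string): Promise<void> {
      const { Versions = [] } = await s3.send(
        new ListObjectVersionsCommand({ Bucket: bucket, Prefix: key })
      );
      const stale = Versions.filter((v) => v.Key === key)
        .sort(
          (a, b) => (b.LastModified?.getTime() ?? 0) - (a.LastModified?.getTime() ?? 0)
        )
        .slice(KEEP); // everything after the newest KEEP versions
      if (stale.length === 0) return;
      await s3.send(
        new DeleteObjectsCommand({
          Bucket: bucket,
          Delete: {
            Objects: stale.map((v) => ({ Key: v.Key!, VersionId: v.VersionId })),
          },
        })
      );
    }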
2021-10-25 17:08:10 |
AWS |
AWS Government, Education, and Nonprofits Blog |
Why moving to the cloud should be part of your sustainability strategy |
https://aws.amazon.com/blogs/publicsector/why-moving-cloud-part-of-sustainability-strategy/
|
Why moving to the cloud should be part of your sustainability strategy. A recent report by 451 Research, part of S&P Global Market Intelligence, commissioned by the Amazon Web Services (AWS) Institute, reveals that moving enterprise and public sector IT workloads from on-premises data centers to the cloud can sharply reduce energy consumption and associated carbon emissions, and that the cloud is five times more energy efficient than the typical on-premises Asia Pacific (APAC) data center. |
2021-10-25 17:28:12 |
js |
New posts tagged JavaScript - Qiita |
Combining Azure Functions and LINE Notify (using Node.js, developed in the portal) |
https://qiita.com/youtoy/items/1f4dcd6dca2ea3a2fd97
|
Development and configuration around Azure Functions was done in the Azure portal. |
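The post builds everything in the Azure portal; purely as a hedged sketch of the core call such a function would make, here is the documented LINE Notify REST endpoint invoked from Node.js (the token variable name is an assumption, and the built-in fetch assumes Node 18+):

    // Send a message via LINE Notify, e.g. from inside an Azure Function handler.
    // LINE_NOTIFY_TOKEN is assumed to be configured as an application setting.
    async function sendLineNotification(message: string): Promise<void> {
      const res = await fetch("https://notify-api.line.me/api/notify", {
        method: "POST",
        headers: {
          Authorization: `Bearer ${process.env.LINE_NOTIFY_TOKEN}`,
          "Content-Type": "application/x-www-form-urlencoded",
        },
        body: new URLSearchParams({ message }).toString(),
      });
      if (!res.ok) throw new Error(`LINE Notify failed: ${res.status}`);
    }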
2021-10-26 02:34:07 |
Program |
New questions in [all tags]|teratail |
A calculator supporting single-digit positive integers, addition, multiplication, and (non-nested) parentheses |
https://teratail.com/questions/366219?rss=all
|
A calculator supporting single-digit positive integers, addition, multiplication, and non-nested parentheses. Background / what I want to achieve: I am building a calculator in C. |
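Not the asker's C code, but one way to approach the question, sketched here in TypeScript: treat each (non-nested) parenthesized group as a single operand, then evaluate the flat expression with * binding tighter than +:

    // Evaluate expressions like "2+3*(4+5)" with single-digit operands,
    // +, *, and non-nested parentheses.
    function evaluate(expr: string): number {
      // Replace each parenthesized group with its computed value.
      const flat = expr.replace(/\(([^()]*)\)/g, (_, inner) => String(evalFlat(inner)));
      return evalFlat(flat);
    }

    // Evaluate a parenthesis-free expression, applying * before +.
    function evalFlat(expr: string): number {
      return expr
        .split("+")
        .map((term) => term.split("*").map(Number).reduce((a, b) => a * b, 1))
        .reduce((a, b) => a + b, 0);
    }

    console.log(evaluate("2+3*(4+5)")); // 29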
2021-10-26 02:32:33 |
Program |
New questions in [all tags]|teratail |
JS: mega menu on the PC version, hamburger with accordion on the smartphone version |
https://teratail.com/questions/366218?rss=all
|
JS: mega menu on the PC version, hamburger with accordion on the smartphone version. As the title says: on the PC version there is a mega menu, and hovering over the menu displays the menu items. |
2021-10-26 02:04:08 |
Azure |
New posts tagged Azure - Qiita |
Combining Azure Functions and LINE Notify (using Node.js, developed in the portal) |
https://qiita.com/youtoy/items/1f4dcd6dca2ea3a2fd97
|
Development and configuration around Azure Functions was done in the Azure portal. |
2021-10-26 02:34:07 |
Overseas TECH |
Ars Technica |
“Uh, no”—Pfizer scientist denies Holmes’ claim that Pfizer endorsed Theranos tech |
https://arstechnica.com/?p=1807218
|
theranos |
2021-10-25 17:18:57 |
Overseas TECH |
Ars Technica |
macOS 12 Monterey: The Ars Technica review |
https://arstechnica.com/?p=1806212
|
iterative |
2021-10-25 17:00:56 |
Overseas TECH |
MakeUseOf |
11 Great Reasons You Should Subscribe to Amazon Prime Video |
https://www.makeuseof.com/reasons-to-subscribe-to-amazon-prime-video/
|
11 Great Reasons You Should Subscribe to Amazon Prime Video. Amazon Prime Video is one of the most popular streaming services worldwide, and for good reason. Here's why Amazon Prime Video is worth your money. |
2021-10-25 17:42:54 |
Overseas TECH |
MakeUseOf |
Can’t Drag And Drop in Windows 10? Here's How to Fix That |
https://www.makeuseof.com/cant-drag-and-drop-windows-10/
|
thatif |
2021-10-25 17:31:50 |
Overseas TECH |
MakeUseOf |
Google Meet Wants to Make Meetings More Productive: Here's How |
https://www.makeuseof.com/google-meet-to-make-meetings-more-productive/
|
audio |
2021-10-25 17:14:37 |
Overseas TECH |
MakeUseOf |
How to Spot Fake News With This Handy Tool |
https://www.makeuseof.com/how-to-spot-fake-news-with-credder/
|
credder |
2021-10-25 17:02:12 |
Overseas TECH |
DEV Community |
Generate realtime GitHub contribution chart using puppeteer and update it realtime in your twitter banner. |
https://dev.to/gillarohith/generate-realtime-github-contribution-chart-using-puppeteer-and-update-it-realtime-in-your-twitter-banner-3l32
|
Generate a realtime GitHub contribution chart using puppeteer and update it in realtime in your Twitter banner. Build amazing, dynamically updated images with the help of Node.js and puppeteer.

Introduction
We usually tend to like dynamically generated content more; it has a bit more to it, and it feels cool. An example of such an image is the one below, generated directly from a cloud function. (P.S. Note it may take some time to generate; it depends on multiple factors.) We will learn how to use Puppeteer, how to customise the content, and more. Let's dive right in.

Pre-requisites
Basic Node.js; TypeScript; a Twitter developer account (if you want real-time banner automation); and a few minutes of your time.

What are we going to build?
We are going to build a script that generates such images. You can see my live GitHub contribution graph along with the image in my Twitter header (Twitter: @gillarohith). If we observe, this image is a mixture of two images and some custom writing on them.

Development
This section is divided into multiple sub-sections so that it is easier to follow. You can use npm, yarn, or pnpm as your package manager; just replace the commands appropriately. For the rest of the steps I will be using yarn.

Setup the application
Let's create a folder and initialise an empty node application:

    mkdir github-live-banner
    cd github-live-banner
    yarn init -y

We need puppeteer and dotenv as dependencies. (Psst, we will be adding a couple more dependencies by the end of the post; stay tuned.) Since we will be using TypeScript, we also need typescript, ts-node, and nodemon as devDependencies:

    yarn add puppeteer dotenv
    yarn add -D typescript ts-node @types/node nodemon

Once we have them installed, we are ready to configure our scripts:

    "scripts": {
      "start": "node dist/index.js",
      "watch": "tsc -w",
      "dev": "nodemon dist/index.js",
      "build": "tsc",
      "postinstall": "npm run build"
    }

The watch script runs tsc in watch mode; it listens for changes in the TypeScript files and compiles them to JS as soon as we save them. During development you can keep it running in the background. The dev script uses nodemon to run dist/index.js as soon as it changes. postinstall, build, and start are needed during and after the deploy.

Since we are using TypeScript, we need a tsconfig.json file. You can generate one using a command line utility:

    npx tsconfig.json

In case the above command doesn't work, you can find the config file below (the exact es* version suffixes were stripped from the feed text; the values below are the usual output of that utility):

    {
      "compilerOptions": {
        "target": "es6",
        "module": "commonjs",
        "lib": ["dom", "es6", "es2017", "esnext.asynciterable"],
        "skipLibCheck": true,
        "sourceMap": true,
        "outDir": "./dist",
        "moduleResolution": "node",
        "removeComments": true,
        "noImplicitAny": true,
        "strictNullChecks": true,
        "strictFunctionTypes": true,
        "noImplicitThis": true,
        "noUnusedLocals": true,
        "noUnusedParameters": true,
        "noImplicitReturns": true,
        "noFallthroughCasesInSwitch": true,
        "allowSyntheticDefaultImports": true,
        "esModuleInterop": true,
        "emitDecoratorMetadata": true,
        "experimentalDecorators": true,
        "resolveJsonModule": true,
        "baseUrl": "."
      },
      "exclude": ["node_modules"],
      "include": ["src/**/*.ts"]
    }

With this we are good to start the development journey.

Environment file
We need Twitter credentials if we want to update the banner dynamically. Follow the steps in the "Twitter Developer Account" section of the article "Develop and Deploy a serverless python application that updates Twitter banner in real time" for detailed instructions with images. After those steps you will have the following values: CONSUMER_KEY, CONSUMER_SECRET, ACCESS_TOKEN, ACCESS_TOKEN_SECRET. In your .env file, update the details as below:

    CONSUMER_KEY=your_key
    CONSUMER_SECRET=your_key
    ACCESS_TOKEN=your_key
    ACCESS_TOKEN_SECRET=your_key

Taking a screenshot using puppeteer
First things first, we need to initialise a headless Chrome instance before we take a screenshot:

    const browser = await puppeteer.launch({
      // the flags are useful when we deploy
      args: ["--no-sandbox", "--disable-setuid-sandbox"],
    });

After opening the browser, we need to open a page:

    const page = await browser.newPage();

We can set the viewport size for clarity and other purposes (the numeric values were lost in the feed text; these are illustrative):

    await page.setViewport({ width: 1920, height: 1080, deviceScaleFactor: 2 });

TL;DR of deviceScaleFactor: the higher the deviceScaleFactor, the better the clarity.

Once the page is open, we visit the required page. Since we are making the GitHub contribution graph our banner, let's go to our GitHub profile page:

    // the exact networkidle variant (0 or 2) was lost in the feed text
    await page.goto(`https://github.com/${GITHUB_USERNAME}`, { waitUntil: "networkidle2" });

Now we need to wait until the GitHub contribution chart is populated, which we achieve with selectors. To get the required CSS selector: go to the developer console, select the element you want, then right click on the element → Copy → Copy selector. The result is a long copied selector ending in div.js-yearly-contributions; its nth-child indices were mangled in the feed text, so copy your own via DevTools:

    const GITHUB_CONTRIBUTION_SELECTOR = "#js-pjax-container ... div.js-yearly-contributions > div:nth-child(1)";

Now we tell puppeteer to wait until the selector has loaded:

    await page.waitForSelector(GITHUB_CONTRIBUTION_SELECTOR);

After it has loaded, we select the element and take the screenshot:

    const element = await page.$(GITHUB_CONTRIBUTION_SELECTOR);
    if (element) {
      await element.screenshot({ path: "contributions.png" });
    }

Boom! Now you can see contributions.png in your local file system. Putting it all together:

    import puppeteer from "puppeteer";

    const GITHUB_USERNAME = "Rohithgilla";
    const GITHUB_CONTRIBUTION_SELECTOR = "..."; // the DevTools-copied selector from above

    const main = async () => {
      const browser = await puppeteer.launch({
        args: ["--no-sandbox", "--disable-setuid-sandbox"],
      });
      const page = await browser.newPage();
      await page.setViewport({ width: 1920, height: 1080, deviceScaleFactor: 2 });
      await page.goto(`https://github.com/${GITHUB_USERNAME}`, { waitUntil: "networkidle2" });
      await page.waitForSelector(GITHUB_CONTRIBUTION_SELECTOR);
      const element = await page.$(GITHUB_CONTRIBUTION_SELECTOR);
      if (element) {
        await element.screenshot({ path: "contributions.png" });
      }
      await browser.close();
      console.log("Done creating the screenshot");
    };

    main();

Puppeteer customisations
If we observe the screenshot, there are a few things we want to change: dark mode; removing the "Learn how we count contributions" text from the image; and adding some padding and margins around the chart.

Dark mode: we emulate dark mode by running the following command after we visit the website:

    await page.emulateMediaFeatures([
      { name: "prefers-color-scheme", value: "dark" },
    ]);

Hide the unwanted line: we use the same method as in step one to get the CSS selector of the line. To save you the trouble, it is another long DevTools-copied selector, ending in div.float-left; call it REMOVE_SELECTOR. Once we select the element, we set its display style to none:

    // puppeteer: hide the selected element
    await page.evaluate((selector) => {
      const element = document.querySelector(selector) as HTMLElement;
      element.style.display = "none";
    }, REMOVE_SELECTOR);

Adding margins and paddings: we add margins and padding around the contribution selector, CONTRIBUTION_SELECTOR (the copied selector ending in an h2). The pixel values were lost in the feed text; these are illustrative:

    await page.evaluate((selector) => {
      const element = document.querySelector(selector) as HTMLElement;
      element.style.margin = "10px";
      element.style.paddingTop = "10px";
    }, CONTRIBUTION_SELECTOR);

The customisations can go on endlessly: colors, sizes, and more. Putting everything together, the script is the same as before, with the emulateMediaFeatures call added after page.goto and the two evaluate calls placed before the screenshot; the full listing is in the repository. Once we make these changes, the screenshot already looks beautiful.

Node Canvas & Sharp
Now it's time for some transformations, merging, and fine-tuning. For this section we need the canvas and sharp packages:

    yarn add canvas sharp
    yarn add -D @types/sharp

If we look at the image generated in the introduction section, it is a merge of the two images; you can pick any amazing background image you like. First we need to resize the chart image to a certain size so that it fits the background image. With sharp we can do many things, one of which is rounding the corners of the image so that it looks nice. To do that, let's first import the sharp package:

    import sharp from "sharp";

then do some of the magic transformations with it (the rx/ry values were lost in the feed text; these are illustrative):

    const beforeResize = await loadImage(filename);
    const toResizeWidth = beforeResize.width;
    const toResizeHeight = beforeResize.height;
    const roundedCorners = Buffer.from(
      `<svg><rect x="0" y="0" width="${toResizeWidth}" height="${toResizeHeight}" rx="15" ry="15"/></svg>`
    );
    await sharp(filename)
      .resize(toResizeWidth, toResizeHeight)
      .composite([{ input: roundedCorners, blend: "dest-in" }])
      .toFile(`${__dirname}/rounded_corner.png`);

Just as a reference, the rounded-corner image would look similar to this. Now, to finish the banner, we need to do the following tasks: merge the images; write text on the image; return the buffer.

Merge the images
We don't exactly merge them; we create a canvas and put one image over the other, using node-canvas. Twitter banners are usually around 1500x500 (the exact numbers were lost in the feed text), so let's create a canvas of that size:

    import { createCanvas, loadImage } from "canvas";

    const canvas = createCanvas(1500, 500);
    const ctx = canvas.getContext("2d");

Load the images we have into the canvas:

    const img = await loadImage(`${__dirname}/rounded_corner.png`);
    const base = await loadImage(`${__dirname}/resize_base.png`);

Draw (insert) the images on the canvas at the positions you like. Note that if you are using custom sizes you may need to do some trial and error here (the coordinates below are illustrative):

    ctx.drawImage(base, 0, 0);
    ctx.drawImage(img, 500, 100);

Note that the numbers are the coordinates of the images.

Write text on the image
Writing text on the image is the simplest of all the steps. We choose a font and a font size and write (values illustrative):

    ctx.font = "30px Arial";
    ctx.fillStyle = "white";
    ctx.fillText("The GitHub contribution chart, updated in realtime", 50, 80);

Then we return the buffer:

    return canvas.toBuffer();

Tip: if you want a png or jpeg file, you can use createPNGStream and the fs module:

    canvas.createPNGStream().pipe(fs.createWriteStream(`${__dirname}/output.png`));

Wrapping it all together, the function looks like this:

    import { createCanvas, loadImage } from "canvas";
    import sharp from "sharp";

    export const addTextToImage = async (filename: string) => {
      // resize is required only the first time
      await sharp("base.png").resize(1500, 500).toFile("resize_base.png");
      const beforeResize = await loadImage(filename);
      const toResizeWidth = beforeResize.width;
      const toResizeHeight = beforeResize.height;
      const roundedCorners = Buffer.from(
        `<svg><rect x="0" y="0" width="${toResizeWidth}" height="${toResizeHeight}" rx="15" ry="15"/></svg>`
      );
      await sharp(filename)
        .resize(toResizeWidth, toResizeHeight)
        .composite([{ input: roundedCorners, blend: "dest-in" }])
        .toFile(`${__dirname}/rounded_corner.png`);
      const img = await loadImage(`${__dirname}/rounded_corner.png`);
      const base = await loadImage(`${__dirname}/resize_base.png`);
      const canvas = createCanvas(1500, 500);
      const ctx = canvas.getContext("2d");
      ctx.drawImage(base, 0, 0);
      ctx.drawImage(img, 500, 100);
      ctx.font = "30px Arial";
      ctx.fillStyle = "white";
      ctx.fillText("The GitHub contribution chart, updated in realtime", 50, 80);
      return canvas.toBuffer();
    };

Updating the Twitter banner
Now the fun part, where we update our Twitter banner with the image we have generated. First, install the twitter package:

    yarn add twitter

Initiate the Twitter client (the digit in the variable names was stripped in the feed text; "V1" is assumed):

    const TwitterV1 = require("twitter");

    const credentials = {
      consumer_key: process.env.CONSUMER_KEY,
      consumer_secret: process.env.CONSUMER_SECRET,
      access_token_key: process.env.ACCESS_TOKEN,
      access_token_secret: process.env.ACCESS_TOKEN_SECRET,
    };
    const clientV1 = new TwitterV1(credentials);

The Twitter API accepts the banner in base64 format, so we need to convert the buffer returned from the canvas to base64:

    const base64 = await addTextToImage(`${__dirname}/contributions.png`);
    console.log("Done editing the screenshot");
    clientV1.post(
      "account/update_profile_banner",
      { banner: base64.toString("base64") },
      (err: any, data: any, response: { toJSON: () => any }) => {
        if (err) console.log(err);
        const json = response.toJSON();
        console.log(json.statusCode, json.headers, json.body);
      }
    );

Now open your Twitter account, and voila!

Run it periodically
To run the script periodically we use the JavaScript setInterval function (the interval value was lost in the feed text; the one below is illustrative):

    main();
    setInterval(() => {
      main();
    }, 60 * 1000);

This runs the main function once every interval. Putting it all together, the final script combines the screenshot capture, the image processing, and the Twitter update shown above, wrapped in a try/catch and with require("dotenv").config() at the top; the full listing is in the repository.

Deployment
We can simply deploy this to Heroku with a worker type. In the project root, create a Procfile and update its contents as below:

    worker: npm start

    heroku create
    heroku buildpacks:add jontewks/puppeteer
    git push heroku main
    heroku ps:scale worker=1

Make sure to add the env variables to your Heroku project inside the config variables section. Please let me know if you encounter any issues with the deployment; I will make a video if needed.

Code
The code resides inside the heroku branch of this repository: GitHub, Rohithgilla/puppeteer-github-banner (heroku branch). The other branches correspond to different deployment methods, which I will be updating soon, so please stay tuned. Star the repository and follow me on GitHub; it really motivates me to write such content.

Next blog post
The next blog posts are going to be really interesting; I have amazing content planned down the road. Just a few of them include: making a Docker container on your own and deploying it for free; creating an Open Graph image generator; serverless puppeteer functions. Follow me so you don't miss any updates :D You can find me on Twitter to stay updated. Thanks, Rohith Gilla |
2021-10-25 17:12:12 |
Overseas TECH |
DEV Community |
How To Solve Facebook And Instagram oEmbed Issue In WordPress |
https://dev.to/codewatchers_en/how-to-solve-facebook-and-instagram-oembed-issue-in-wordpress-3mk5
|
How To Solve Facebook And Instagram oEmbed Issue In WordPress. Do you have embedded Facebook and Instagram posts in your WordPress site? They might stop working soon, notably via the default oEmbed or Embed blocks feature. In fact, it turns out that as of this October there will be a breaking change in the Facebook API that blocks the automatic publication of Facebook and Instagram content on your site. But do not worry: we have the solution to this problem, and that is what we are going to share with you in this article. What's the matter with Facebook & Instagram embeds? It all started with a Facebook announcement making it clear that all oEmbed requests for Facebook and Instagram content will be deprecated in October. This API is essential for Gutenberg and the WordPress Classic editor's default embed feature; thanks to it, it is easy to embed videos, pictures, updates, and other content from Facebook and Instagram. Now Facebook requires every developer to register an app and use a client token when getting data from their Graph API for oEmbed content, which is not an option for the WordPress core team. Instead, it decided to remove the Facebook and Instagram embed feature from WordPress core in favor of letting WordPress plugins solve the issue for users (core ticket). Concretely, from October, Facebook and Instagram embeds in your WordPress content will break. This is what Facebook integrations will look like, and this is what Instagram integrations will look like. Read The Full Tutorial. |
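For developers who want to keep embeds working themselves, the registered-app flow the article mentions boils down to calling the Graph API oEmbed endpoints with an app access token. A hedged sketch, assuming Node 18+ fetch and hypothetical environment variable names:

    // Fetch oEmbed data for a public Facebook post via the Graph API.
    // The access token format "APP_ID|CLIENT_TOKEN" is the documented client-token form.
    const APP_ID = process.env.FB_APP_ID;           // assumed env var
    const CLIENT_TOKEN = process.env.FB_CLIENT_TOKEN; // assumed env var

    async function fetchOEmbed(postUrl: string): Promise<unknown> {
      const endpoint = new URL("https://graph.facebook.com/v12.0/oembed_post");
      endpoint.searchParams.set("url", postUrl);
      endpoint.searchParams.set("access_token", `${APP_ID}|${CLIENT_TOKEN}`);
      const res = await fetch(endpoint);
      if (!res.ok) throw new Error(`oEmbed request failed: ${res.status}`);
      return res.json(); // contains the embeddable HTML snippet
    }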
2021-10-25 17:02:00 |
Apple |
AppleInsider - Frontpage News |
Hands on with the best new features in macOS Monterey |
https://appleinsider.com/articles/21/10/25/hands-on-with-the-best-new-features-in-macos-monterey?utm_medium=rss
|
Hands on with the best new features in macOS Monterey. After months of public and developer beta testing, macOS Monterey is now widely available for download as a free update for many Mac users. We've been testing it for quite some time, and these are our favorite features of Apple's newest release. Read more |
2021-10-25 17:39:58 |
Apple |
AppleInsider - Frontpage News |
Apple Fitness+ coming to 15 new countries on Nov. 3 |
https://appleinsider.com/articles/21/10/25/apple-fitness-coming-to-15-new-countries-on-nov-3?utm_medium=rss
|
Apple Fitness+ coming to 15 new countries on Nov. 3. Apple Fitness+ will become available in a total of 15 additional countries starting Nov. 3, including Brazil, France, Germany, Mexico, Russia, and Saudi Arabia. (Credit: Apple) The company first announced the expansion of Apple Fitness+'s coverage area back in September. On Monday, Apple announced that the premium workout service would expand to the new regions on Wednesday, Nov. 3. Read more |
2021-10-25 17:37:57 |
Apple |
AppleInsider - Frontpage News |
HomePod 15.1 update brings Apple Music Lossless & Dolby Atmos support |
https://appleinsider.com/articles/21/10/25/homepod-151-update-brings-apple-music-lossless-dolby-atmos-support?utm_medium=rss
|
HomePod 15.1 update brings Apple Music Lossless & Dolby Atmos support. An update for the HomePod mini and HomePod has been issued by Apple, with HomePod Software Version 15.1 adding Lossless support along with Dolby Atmos to the company's smart speakers. At the time of launching Apple Music Lossless and Spatial Audio, Apple said the HomePod mini and HomePod would gain support at a later time; with Monday's update, the speakers gain those very features. Surfacing in a September beta update, HomePod Software Version 15.1 brings back the features that were previously intended for version 15 but were pulled before its official release. Read more |
2021-10-25 17:34:42 |
Apple |
AppleInsider - Frontpage News |
Apple's tvOS 15.1 is now available for Apple TV |
https://appleinsider.com/articles/21/10/25/apples-tvos-151-is-now-available-for-apple-tv?utm_medium=rss
|
Apple's tvOS 15.1 is now available for Apple TV. The newly updated tvOS 15.1 has just been released, bringing the usual slate of bug fixes and restoring a missing feature from the tvOS 15 launch. The new Apple TV software update primarily focuses on bug fixes and compatibility. SharePlay has also been added after initially being delayed; the feature, which Apple announced during WWDC, allows users to watch video content with others while on a FaceTime call. Read more |
2021-10-25 17:18:48 |
Apple |
AppleInsider - Frontpage News |
Apple issues watchOS 8.1 update for Apple Watch |
https://appleinsider.com/articles/21/10/25/apple-issues-watchos-81-update-for-apple-watch?utm_medium=rss
|
Apple issues watchOS 8.1 update for Apple Watch. Apple has released its update for watchOS to Apple Watch owners, bringing the wearable device's operating system up to watchOS 8.1 in what is thought to be a bug-fix and performance-improvement release. Apple Watch users can update watchOS by opening the iOS Watch app, selecting General, then Software Update, and following the onscreen prompts. The update will also install automatically if set to do so within the same app. The Apple Watch being updated has to have sufficient charge, be placed on a charger throughout the update, and be within range of the iPhone to initiate the update. Read more |
2021-10-25 17:15:00 |
Apple |
AppleInsider - Frontpage News |
Apple releases macOS Monterey with Shortcuts and Live Text |
https://appleinsider.com/articles/21/10/25/apple-releases-macos-monterey-with-shortcuts-and-live-text?utm_medium=rss
|
Apple releases macOS Monterey with Shortcuts and Live Text. Apple has released macOS Monterey, the latest iteration of its Mac and MacBook operating system, with users now able to download the final release version to their devices. Apple's release follows multiple beta rounds, including two Release Candidate versions. It arrives one day ahead of the release date of the new MacBook Pro models, including the 14-inch MacBook Pro and 16-inch MacBook Pro. Users can manually force the update by opening System Preferences on their Mac, selecting Software Update, and following the prompts. Read more |
2021-10-25 17:18:20 |
Apple |
AppleInsider - Frontpage News |
AirPods 3 review roundup: Better sound and fit despite lack of 'Pro' features |
https://appleinsider.com/articles/21/10/25/airpods-3-review-roundup-better-sound-and-fit-despite-lack-of-pro-features?utm_medium=rss
|
AirPods 3 review roundup: Better sound and fit despite lack of 'Pro' features. Reviews of Apple's new third-generation AirPods are starting to drop, highlighting significant improvements to overall audio quality and in-ear fit. (Credit: Apple) Apple's new third-generation AirPods feature an updated design that's more reminiscent of the company's AirPods Pro. Although they lack the interchangeable ear tips of their more expensive counterpart, many reviewers praised the fit and comfort of the new base AirPods. Read more |
2021-10-25 17:43:56 |
Apple |
AppleInsider - Frontpage News |
Apple releases iOS 15.1, iPadOS 15.1 with SharePlay, vaccine cards in Wallet |
https://appleinsider.com/articles/21/10/25/apple-releases-ios-151-ipados-151-with-shareplay-vaccine-cards-in-wallet?utm_medium=rss
|
Apple releases iOS 15.1, iPadOS 15.1 with SharePlay, vaccine cards in Wallet. Apple has released iOS 15.1 and iPadOS 15.1, incremental updates to the company's iOS platform that contain new features like verifiable COVID vaccination card support in Wallet. (Credit: Andrew O'Hara, AppleInsider) The new software updates are currently available as free over-the-air downloads on compatible iPhones and iPads. They can be acquired by heading to General and Software Update in the Settings app. Read more |
2021-10-25 17:10:26 |
Overseas TECH |
Engadget |
Zoom's automatic closed captioning rolls out to all free users |
https://www.engadget.com/zoom-automatic-closed-captions-free-users-172250054.html?src=rss
|
Zoom's automatic closed captioning rolls out to all free users. Zoom's live transcription feature is now widely available to all free users. Previously it was a feature you had to pay to access, but toward the start of the year Zoom said it would roll it out to everyone. Now that it's here, free users don't need to request access from the company if they need the tool for their meetings. If you're in a call and want to ask the host to turn on live transcription, you can do so using the meeting toolbar. At the moment the feature only works in English, but support for more languages is on the way: in September, Zoom said it would offer automated closed captioning in additional languages over the next year. The company is also working on adding live translation over that same time frame. |
2021-10-25 17:22:50 |
Overseas TECH |
Engadget |
macOS Monterey is out now without SharePlay |
https://www.engadget.com/apple-macos-monterey-release-facetime-safari-shareplay-171102479.html?src=rss
|
macOS Monterey is out now, without SharePlay. Apple has at long last released the latest major version of its Mac operating system, macOS Monterey. While it's perhaps a more modest update than in previous years, there are some significant changes in some areas of the OS. The redesigned Safari might be the most obvious transformation for many users; Apple initially planned to remove the tabs bar before it thankfully saw sense and decided to leave it as-is in a later developer preview. The bar will match the color of the web page you're viewing, and there are some new features such as Tab Groups. Apple has overhauled FaceTime in macOS Monterey too: it works a little more like other conference-calling software, in that you can start a call and then invite other people, including folks using Android or Windows devices, through the new FaceTime web app. M1 Macs will also support spatial audio for FaceTime and other features through AirPods and AirPods Max. Elsewhere, macOS Monterey adds the Focus modes seen in iOS 15 and iPadOS 15, Quick Notes, Shortcuts, and a new-look Maps app. Live Text, Apple's answer to Google Lens, is another new tool at macOS users' disposal. SharePlay, the feature that lets people sync streaming videos and music with friends, isn't available just yet. You'll also need to wait a little longer for Universal Control, which brings Mac and iPad together: you can move your cursor from one to the other and drag files between devices. SharePlay and Universal Control will arrive on macOS later this fall. |
2021-10-25 17:11:02 |
Overseas TECH |
Engadget |
iOS 15.1 turns on SharePlay for Apple Fitness+ |
https://www.engadget.com/apple-fitness-group-workout-shareplay-available-now-170031703.html?src=rss
|
iOS 15.1 turns on SharePlay for Apple Fitness+. Apple said in September that it was launching a feature called Group Workouts on Fitness+ that would use iOS 15's SharePlay tool for exercise sessions over FaceTime. The company just announced that Group Workouts is available starting today, so you can get a group of friends together to follow along with the company's exercise or meditation videos. To use the new features, you'll need to update to iOS 15.1 or iPadOS 15.1, as well as watchOS 8.1, which are available today. Those who plan on watching the videos on their Apple TV will also need tvOS 15.1. To start a Group Workout, you'll need to first be on a FaceTime call, go to the Fitness app, then pick the video to follow. As you all sweat it out, Apple will display each person's metrics on their own screens. When someone moves ahead on the Burn Bar (which appears on specific workouts with more cardio activity) or closes their rings, everyone gets an alert so you can celebrate together. SharePlay wasn't available when iOS 15 launched earlier this year, and during our testing of the iOS 15 beta it was buggy and unstable. The company just released iOS 15.1 today, bringing the ability to SharePlay over FaceTime so you can watch movies and videos with your friends, or just show them what's on your iPhone. Fitness+ is also expanding to new countries: from Nov. 3rd, Austria, Brazil, Colombia, France, Germany, Indonesia, Italy, Malaysia, Mexico, Portugal, Russia, Saudi Arabia, Spain, Switzerland, and the United Arab Emirates will be able to access the service. In America, those on UnitedHealthcare insurance can get a year of Fitness+ on their plans at no additional cost starting Nov. 1st. Update: This post was edited to add information about iOS 15.1's availability starting today. |
2021-10-25 17:00:31 |
Overseas TECH |
CodeProject Latest Articles |
News Track - News Aggregator |
https://www.codeproject.com/Articles/5299293/News-Track-News-Aggregator
|
certain |
2021-10-25 17:34:00 |
Overseas TECH |
CodeProject Latest Articles |
NoisyCrypt |
https://www.codeproject.com/Tips/5316017/NoisyCrypt
|
color |
2021-10-25 17:11:00 |
Overseas Science |
NYT > Science |
The Rich World’s Promise of $100 Billion in Climate Aid Inches Forward |
https://www.nytimes.com/2021/10/25/climate/100-billion-climate-aid-cop26.html
|
The Rich World's Promise of $100 Billion in Climate Aid Inches Forward. Diplomats announced a plan to make good on an unkept promise of climate aid, a key point of tension in the upcoming global climate talks. |
2021-10-25 17:33:25 |
Finance |
Financial Services Agency website |
Agenda published for the 53rd meeting of the Business Accounting Council's Audit Committee |
https://www.fsa.go.jp/singi/singi_kigyou/siryou/kansa/20211026.html
|
corporate accounting |
2021-10-25 18:00:00 |
News |
BBC News - Home |
Frances Haugen says Facebook is 'making hate worse' |
https://www.bbc.co.uk/news/technology-59038506?at_medium=RSS&at_campaign=KARANGA
|
online |
2021-10-25 17:49:09 |
News |
BBC News - Home |
Tory MPs defend votes after uproar over sewage proposals |
https://www.bbc.co.uk/news/uk-politics-59040175?at_medium=RSS&at_campaign=KARANGA
|
environment |
2021-10-25 17:11:36 |
News |
BBC News - Home |
'Constantly cleaning' teenager becomes celebrity car washer |
https://www.bbc.co.uk/news/uk-england-manchester-59037084?at_medium=RSS&at_campaign=KARANGA
|
mctominay |
2021-10-25 17:05:38 |
News |
BBC News - Home |
Afghanistan thrash Scotland by 130 runs at T20 World Cup |
https://www.bbc.co.uk/sport/cricket/59041235?at_medium=RSS&at_campaign=KARANGA
|
world |
2021-10-25 17:43:27 |
News |
BBC News - Home |
'Everyone should be patient with me' - Raducanu seeks first WTA victory |
https://www.bbc.co.uk/sport/tennis/59042621?at_medium=RSS&at_campaign=KARANGA
|
'Everyone should be patient with me': Raducanu seeks first WTA victory. US Open champion Emma Raducanu says everyone should be patient as she attempts this week to earn a first win since her Grand Slam success. |
2021-10-25 17:08:00 |
News |
BBC News - Home |
Britain's Evans upset by teenager Alcaraz in Vienna |
https://www.bbc.co.uk/sport/tennis/59042987?at_medium=RSS&at_campaign=KARANGA
|
carlos |
2021-10-25 17:24:52 |
Business |
Diamond Online - New Articles |
The top shrine in Kochi that anyone living in Shikoku should visit at least once - The 100 Strongest Gods |
https://diamond.jp/articles/-/285708
|
The top shrine in Kochi that anyone living in Shikoku should visit at least once. From "The 100 Strongest Gods": by combining blessings that boost work luck, money luck, love luck, and health luck, you are sure to find a god who will grant your wish, introduced entry by entry from among the myriad (yaoyorozu) gods. |
2021-10-26 02:50:00 |
Business |
Diamond Online - New Articles |
[Can you read this three-character idiom?] 偉□夫 (Hint: a person with a character like Kurama from "YuYu Hakusho".) - Strikingly Beautiful Three-Character Idioms |
https://diamond.jp/articles/-/285658
|
three-character idiom |
2021-10-26 02:45:00 |
IT |
Weekly ASCII |
"iOS 15.1" rollout begins: support for "SharePlay", which enables simultaneous content playback over FaceTime |
https://weekly.ascii.jp/elem/000/004/073/4073078/
|
facetime |
2021-10-26 02:25:00 |
GCP |
Cloud Blog |
Model training as a CI/CD system: Part II |
https://cloud.google.com/blog/topics/developers-practitioners/model-training-cicd-system-part-ii/
|
Model training as a CI/CD system: Part II. In the first part of this blog post we discussed how to monitor code changes and submit a TensorFlow Extended (TFX) pipeline to Vertex AI for execution. We concluded that post with a few questions: What if we wanted to maintain a schedule (say, hourly; usually dependent on the use case) to trigger pipeline runs on Vertex AI? What if we wanted a system such that, during the experimentation phase, whenever a new architecture is published to a Pub/Sub topic, the same pipeline is executed, but with different hyperparameters? In this final half of the blog post, we tackle these situations and discuss some possible workarounds.

Approach. We present a diagrammatic overview of the workflow we will realize in Figures 1 and 2. First, we use Cloud Build to: clone a repository from GitHub that contains all the code needed to build and compile a TFX pipeline ready for execution; build and push a custom Docker image that will be used to execute the pipeline; and upload the compiled pipeline to a bucket on Google Cloud Storage (GCS). This is depicted in Figure 1 (workflow for generating the compiled TFX pipeline).

The pipeline is capable of taking runtime parameters as inputs. This is particularly helpful when you want to keep your pipeline components the same while running different experiments with different sets of hyperparameters, for example; you reuse the pipeline and only create different experiments with varying hyperparameters. Note that you could use the same pipeline for model retraining based on the availability of new data as well. For the purpose of this post we keep things simple and pass the model hyperparameters (number of epochs and optimizer learning rate) as the pipeline runtime parameters.

Figure 2 presents the other half of our workflow, which takes the compiled TFX pipeline and submits it to Vertex AI for execution. We can either take the compiled pipeline spec and submit it to Vertex AI directly, or we can use a trigger mechanism to initiate the pipeline execution. The latter is particularly useful when you want to bridge an event and a pipeline execution; examples of such events include the arrival of new data, new model architectures, a new set of hyperparameters, new preprocessing logic, and so on. Based on events like these, you want a mechanism that automatically triggers or schedules the execution of your pipelines.

We will cover two workarounds. In the first, we publish a message to a Pub/Sub topic to which a Cloud Function is subscribed; this Cloud Function is then responsible for initiating the pipeline execution (for context, the topic message contains model hyperparameters and their values). In the second, we schedule a job using Cloud Scheduler, which is responsible for triggering the Cloud Function. If you'd like to know how to trigger a model training pipeline based on the arrival of new training data in a BigQuery database, refer to this blog post.

Implementation details. In this section we discuss the technical details of the approaches presented above. We will not go too deep into the TFX-related components, focusing instead on the bits primarily at play here, and we provide relevant references for readers interested in the parts not covered in detail. The code shown throughout this section is available in this repository; we used this Google Cloud repository as our main source of reference.
TFX pipeline and compilation. For this post we use the TFX pipeline shown in this TFX tutorial. It uses the Palmer Penguins dataset and trains a simple neural network in TensorFlow that can predict the species of a penguin. The pipeline has the following TFX components: CsvExampleGen, Cloud AI Trainer, and Pusher. Discussing the pipeline bit by bit is out of scope for this post, and we refer readers to the original tutorial linked above.

The pipeline code is first hosted on a GitHub repository (you could also host your code on BitBucket, GitLab, and so on, or even Cloud Repositories). Recall from Figure 1 that we compile the pipeline and upload it to a GCS bucket. The pipeline should be able to take parameters at runtime, and for that we use the RuntimeParameters provided by TFX; in our case these are the number of epochs and the learning rate for the optimizer. You can refer to the entire pipeline creation and compilation code from here.

But the story does not end there. We still have to build and push a custom Docker image including all the utility scripts and any other Python packages; this image is eventually used by Vertex AI to run the submitted pipeline. On top of this, we need to automate all the steps discussed so far as a build process, for which we use Cloud Build. Cloud Build operates on YAML specifications, and the specification file may be easier to read once you refer to the YAML specification document linked above. The variables prefixed with "_" (substitutions) are the ones we set when we call this YAML file to initiate the build process on Cloud Build. After the specification file is configured, we just need to initiate a run on Cloud Build; SUBSTITUTIONS holds all of our variables relevant to the pipeline specification. The entire build process is demonstrated in this notebook. If the build is submitted successfully, it appears on the Cloud Build dashboard (Figure 3: a demo build on Cloud Build). The output of the build is a compiled pipeline specification file (in JSON) that can be submitted to Vertex AI or other orchestrators for execution.

Pub/Sub & Cloud Functions. We now create a Pub/Sub topic and deploy a Cloud Function that is subscribed to it. We publish messages to this topic, and as soon as that happens, the Cloud Function is triggered. If this bit is confusing, don't worry; it will become clear in a moment.
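The post demonstrates this trigger path from a notebook; purely as a hedged illustration in another language, here is what publishing a hyperparameter message could look like with the Node.js Pub/Sub client, assuming the @google-cloud/pubsub package (the topic name and values are hypothetical):

    import { PubSub } from "@google-cloud/pubsub";

    // Hypothetical topic name; the post defines its own in the accompanying notebook.
    const TOPIC = "tfx-pipeline-trigger";

    async function publishHyperparameters(): Promise<void> {
      const pubsub = new PubSub(); // project/credentials come from the environment
      const message = { num_epochs: 5, learning_rate: 0.001 }; // illustrative values
      const messageId = await pubsub
        .topic(TOPIC)
        .publishMessage({ data: Buffer.from(JSON.stringify(message)) });
      console.log(`Published message ${messageId}`);
    }

    publishHyperparameters().catch(console.error);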
The Cloud Function is responsible for parsing the message published to the Pub/Sub topic and then triggering the pipeline run on Vertex AI. Take note of the Python function trigger_pipeline; it is important when deploying the Cloud Function. You can find all the components of the Cloud Function here. To deploy it, we first specify our environment variables and then perform the deployment. Some important parameters of the gcloud functions deploy command: --trigger-topic is the name of our Pub/Sub topic, --source is the directory where the files specific to the Cloud Function are hosted, and --entry-point is the name of the Python function discussed above. For more context, the directory --source points to contains the following files: requirements.txt, specifying the Python packages needed by the Cloud Function, and main.py, containing the definition of trigger_pipeline.

After the Cloud Function is deployed, we can view it on a dashboard along with a number of important statistics (Figure 4: Cloud Function dashboard). Now we can publish a message to the Pub/Sub topic we created earlier; as soon as we do, the Cloud Function subscribed to the topic is triggered and submits our pipeline, with the parsed parameters, to Vertex AI. Figure 5 shows a graphical representation of our TFX pipeline on Vertex AI. You can find the entire integration with Pub/Sub and Cloud Functions in this notebook.

Cloud Scheduler. There are a number of situations where you want to run the pipeline periodically. For example, we might want to wait a certain period of time until we have enough data, and then perform batch predictions to extract embeddings or monitor the model performance. This can be done by integrating Cloud Scheduler into the existing system. Cloud Scheduler is a fully managed, enterprise-ready service for handling cron jobs, and it connects easily to other GCP services such as Pub/Sub.

There are two ways to create a Cloud Scheduler job. The first option is the gcloud CLI tool. You need Cloud Scheduler credentials for your service account; please follow the official document on how to create a service account and download the service account key. Once you have downloaded the key, set the environment variable pointing to the service account key JSON file; the gcloud command recognizes it automatically. gcloud scheduler jobs create pubsub creates a periodic job that publishes to a Pub/Sub topic with a given message. The value of the --schedule option should be set according to the standard cron format; for instance, "*/3 * * * *" means run a task every three minutes. (Running an MLOps pipeline every three minutes doesn't reflect a real-world situation; it is set this way only to demonstrate the behaviour of Cloud Scheduler.) The value of the --topic option should match the name of the Pub/Sub topic you created previously, and the --message-body option lets you deliver additional data to Pub/Sub in JSON format; in this example we use it to push hyperparameters to the Cloud Function. One thing to note when using a Jupyter notebook: the JSON string should be encoded with the json.dumps method, which makes sure it isn't broken when injected into the CLI. (Figure 6: TFX pipeline runs launched periodically on Vertex AI.)

The second option is to use the Python API for Google Cloud Scheduler. There are in fact client APIs for a number of programming languages, since the API is built on top of the language-neutral gRPC/Protocol Buffers; here we only demonstrate the usage in Python. There are three main differences compared to the gcloud command. First, the message should be encoded in UTF-8; this ensures the message is in bytes, as the data parameter of PubsubTarget requires bytes. Second, the Pub/Sub topic name should follow the projects/<PROJECT_ID>/topics/<TOPIC_NAME> format. Third, the Scheduler job name should follow the projects/<PROJECT_ID>/locations/<REGION_ID>/jobs/<JOB_NAME> format. With these differences in mind, the code should be straightforward to understand. For further details about the Python API, please check out the RPC specification and the official document on the Python API. A complete demonstration of what is covered here is in this notebook.
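The post shows the gcloud and Python routes; as a hedged illustration of the same job creation in yet another client, here is what it could look like with the Node.js Cloud Scheduler client, assuming the @google-cloud/scheduler package (project, region, topic, and job names are hypothetical):

    import { CloudSchedulerClient } from "@google-cloud/scheduler";

    async function createTriggerJob(): Promise<void> {
      const client = new CloudSchedulerClient();
      // Hypothetical project and region.
      const parent = client.locationPath("my-project", "us-central1");
      const [job] = await client.createJob({
        parent,
        job: {
          name: `${parent}/jobs/trigger-tfx-pipeline`, // hypothetical job name
          schedule: "*/3 * * * *", // every three minutes, as in the post's demo
          pubsubTarget: {
            topicName: "projects/my-project/topics/tfx-pipeline-trigger",
            // The message must be bytes; hyperparameter values are illustrative.
            data: Buffer.from(JSON.stringify({ num_epochs: 5, learning_rate: 0.001 })),
          },
        },
      });
      console.log(`Created job ${job.name}`);
    }

    createTriggerJob().catch(console.error);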
Cost. For this post the cost stems only from Vertex AI, because the rest of the components (Pub/Sub, Cloud Functions) see very minimal usage. Each pipeline execution run on Vertex AI incurs a flat per-run charge, and for training the model we chose an n1-standard machine type, billed per hour; we did not use GPUs. As per our estimates, the upper bound of the costs incurred is modest. In any case, you should use the GCP Price Calculator to get a better understanding of how your costs might add up for the GCP services you consume.

Conclusion. In this two-part blog post we covered how to treat model training as a CI/CD system, and the various tools needed to accomplish that, especially in the context of GCP. We hope you gained some insight into why this approach can be beneficial when operating at scale. But this is only the tip of the iceberg: with tools like Vertex AI, the possibilities are practically endless, and we encourage you to implement your own workflows on Vertex AI.

Acknowledgements. We are grateful to the ML GDE program, which provided GCP credits to support our experiments. We sincerely thank Karl Weinmeister of Google for his help with the review.

Related article: Model training as a CI/CD system: Part I. A machine learning system is essentially a software system, so to operate such systems scalably we need CI/CD practices in place. Read Article |
2021-10-25 18:00:00 |
GCP |
Cloud Blog |
Google Cloud Next Rollup for Data Analytics |
https://cloud.google.com/blog/products/data-analytics/google-cloud-next-rollup-for-data-analytics/
|
Google Cloud Next Rollup for Data Analytics. October 23rd, this past Saturday, was my Googleversary, and we have just wrapped an incredible Google Next. When I started, we had a dream of making BigQuery the intelligent data warehouse that would power every organization's data-driven digital transformation. This year at Next, it was amazing to see Google Cloud's CEO Thomas Kurian kick off his keynote with the CTO of Walmart, Suresh Kumar, talking about how his organization is giving its data the "BigQuery treatment". As I recap Next and reflect on our amazing journey over the past years, I'm proud of the opportunity I've had to work with some of the world's most innovative companies, from Twitter to Walmart to Home Depot, Snap, PayPal, and many others. So much of what we announced at Next is the result of years of hard work, persistence, and commitment to delivering the best analytics experience for customers. I believe one of the reasons customers choose Google for data is that we have shown a strong alignment between our strategy and theirs, and that we've been relentlessly delivering innovation at the speed they require.

Unified smart analytics platform. Over the past years our focus has been to build the industry's leading unified smart analytics platform. BigQuery is at the heart of this vision and seamlessly integrates with all our other services. Customers can use BigQuery to query data in BigQuery storage, Google Cloud Storage, AWS S3, Azure Blob Storage, and various databases like Bigtable, Spanner, and Cloud SQL. They can also use any engine, like Spark, Dataflow, or Vertex AI, with BigQuery. BigQuery automatically syncs all its metadata with Data Catalog, and users can then run the Data Loss Prevention service to identify sensitive data and tag it; these tags can then be used to create access policies. In addition to Google services, all our partner products integrate with BigQuery seamlessly. Key partners highlighted at Next included: data ingestion, Fivetran, Informatica, and Confluent; data preparation, Trifacta and dbt; data governance, Collibra; data science, Databricks and Dataiku; and BI, Tableau, Power BI, Qlik, and others.

Planet-scale analytics with BigQuery. BigQuery is an amazing platform, and over the past years we have continued to innovate across many aspects of it. Scalability has always been a huge differentiator: BigQuery has many customers with petabytes of data, our largest customer is now approaching an exabyte of data, and our large customers have run queries over trillions of rows. But scale for us is not just about storing or processing a lot of data; scale is also about reaching every organization in the world. This is the reason we launched BigQuery Sandbox, which enables organizations to get started with BigQuery without a credit card and has helped us reach tens of thousands of customers. Additionally, to make it easy to get started with BigQuery, we have built integrations with various Google tools like Firebase, Google Ads, and Google Analytics. Finally, to simplify adoption, we now provide options for customers to choose whether they would like to pay per query, buy flat-rate subscriptions, or buy per-second capacity. With our autoscaling capabilities we can provide customers the best value by mixing flat-rate subscription discounts with autoscaling via flex slots.

An intelligent data warehouse to empower every data analyst to become a data scientist. BigQuery ML is one of the biggest innovations we have brought to market over the past few years. Our vision is to make every data analyst a data scientist by democratizing machine learning. Much of a data scientist's time is spent moving, prepping, and transforming data for the ML platform; this also creates a huge data governance problem, as every data scientist ends up with a copy of your most valuable data. Our approach was very simple: we asked, what if we could bring ML to the data rather than taking the data to an ML engine? That is how BigQuery ML was born: simply write a few lines of SQL and create ML models. Over the past years we have launched many model types, like regression, matrix factorization, anomaly detection, time series, XGBoost, and DNNs. Customers use these models to solve complex business problems simply, from segmentation and recommendations to time-series forecasting and package delivery estimation. The service is very popular, and a large share of our top customers use BigQuery ML today; when you consider that the average adoption rate of ML/AI is low, that is a pretty good result. We announced tighter integration of BigQuery ML with Vertex AI: model explainability will provide the ability to explain the results of predictive ML classification and regression models by understanding how each feature contributes to the predicted result, and users will be able to manage, compare, and deploy BigQuery ML models in Vertex and leverage Vertex Pipelines to train and predict BigQuery ML models.
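To make the "few lines of SQL" point concrete, here is a minimal, hedged sketch of a BigQuery ML CREATE MODEL statement submitted through the Node.js BigQuery client, assuming the @google-cloud/bigquery package; the dataset, table, columns, and model name are all hypothetical:

    import { BigQuery } from "@google-cloud/bigquery";

    async function trainModel(): Promise<void> {
      const bigquery = new BigQuery();
      // CREATE MODEL is the documented BigQuery ML syntax; names below are made up.
      const [job] = await bigquery.createQueryJob({
        query: `
          CREATE OR REPLACE MODEL \`my_dataset.churn_model\`
          OPTIONS (model_type = 'logistic_reg', input_label_cols = ['churned']) AS
          SELECT tenure_months, monthly_spend, churned
          FROM \`my_dataset.customers\`
        `,
      });
      await job.getQueryResults(); // waits for training to finish
      console.log("Model trained");
    }

    trainModel().catch(console.error);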
Real-time streaming analytics with BigQuery. Customer expectations are changing, and everyone wants everything in an instant; according to Gartner, by the end of 2024, 75% of enterprises will shift from piloting to operationalizing AI, driving a 5X increase in streaming data and analytics infrastructures. BigQuery's storage engine is optimized for real-time streaming: BigQuery supports streaming ingestion of hundreds of millions of events in real time, with no impact on query performance. Additionally, customers can use materialized views and BI Engine (now GA) on top of streaming data; we guarantee always fast, always fresh data, and our system automatically updates MVs and BI Engine. Many customers also use our Pub/Sub service to collect real-time events and process them through Dataflow prior to ingesting into BigQuery; this streaming-ETL pattern is very popular. Last year we announced Pub/Sub Lite to give customers a lower price point and a TCO lower than any DIY Kafka deployment. We also announced Dataflow Prime, our next-generation platform for Dataflow big data processing. Big data processing platforms have focused only on horizontal scaling to optimize workloads, but we are seeing new patterns and use cases, like streaming AI, where a few steps in a pipeline perform data prep and then run a GPU-based model; customers want to use different sizes and shapes of machines to run these pipelines in the most optimal manner, and this is exactly what Dataflow Prime does. It delivers vertical autoscaling with right-fitting for your pipelines, which we believe should lower pipeline costs significantly. With Datastream, our change data capture service built on Alooma technology, we have solved the last key problem space for customers: we can automatically detect changes in your operational databases, like MySQL, Postgres, or Oracle, and sync them into BigQuery. Most importantly, all these products work seamlessly with each other through a set of templates, and our goal is to make this even more seamless over the next year.

Open data analytics with BigQuery. Google has always been a big believer in open source initiatives, and our customers love using open source offerings like Spark, Flink, Presto, and Airflow.
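As a small, hedged illustration of the streaming-ingestion path described above, using the @google-cloud/bigquery client's streaming insert; the dataset, table, and row shapes are hypothetical:

    import { BigQuery } from "@google-cloud/bigquery";

    async function streamEvents(): Promise<void> {
      const bigquery = new BigQuery();
      // Rows become queryable shortly after a successful streaming insert.
      await bigquery
        .dataset("events_dataset") // hypothetical dataset
        .table("click_events")     // hypothetical table
        .insert([
          { user_id: "u-123", event: "click", ts: new Date().toISOString() },
          { user_id: "u-456", event: "view", ts: new Date().toISOString() },
        ]);
      console.log("Streamed 2 rows");
    }

    streamEvents().catch(console.error);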
Many customers also use our Pub/Sub service to collect real-time events and process them through Dataflow prior to ingesting into BigQuery; this streaming ETL pattern is very popular. Last year we announced Pub/Sub Lite to provide customers a lower price point and a TCO that is lower than any DIY Kafka deployment. We also announced Dataflow Prime, our next-generation platform for Dataflow. Big data processing platforms have focused only on horizontal scaling to optimize workloads, but we have seen new patterns and use cases, like streaming AI, where a few steps in a pipeline perform data prep and then customers have to run a GPU-based model. Customers want to use different sizes and shapes of machines to run these pipelines in the most optimal manner, and this is exactly what Dataflow Prime does: it delivers vertical autoscaling with right fitting for your pipelines. We believe this should lower costs for pipelines significantly. With Datastream, our change data capture service built on Alooma technology, we have solved the last key problem space for customers: we can automatically detect changes in your operational databases, like MySQL, Postgres, and Oracle, and sync them into BigQuery. Most importantly, all these products work seamlessly with each other through a set of templates, and our goal is to make this even more seamless over the next year.

Open Data Analytics with BigQuery

Google has always been a big believer in open source initiatives, and our customers love using open source offerings like Spark, Flink, Presto, and Airflow. With Dataproc & Composer, our customers have been able to run these open source frameworks on GCP and leverage our scale, speed, and security. Dataproc is a great service and delivers massive savings to customers moving from on-prem Hadoop environments. But customers want to focus on jobs, not clusters. That's why we launched the Dataproc Serverless Spark GA offering at Next. This new service adheres to one of the key design principles we started with: make data simple. Just like with BigQuery you simply RUN QUERY, with Spark on Google Cloud you simply RUN JOB. ZDNet did a great piece on this; I invite you to check it out. Many of our customers are moving to Kubernetes and want to use it as the platform for Spark, so our upcoming Spark on GKE offering will give them the ability to deploy Spark workloads on existing Kubernetes clusters. But for me, the most exciting capability is the ability to run Spark directly on BigQuery storage. BigQuery storage is highly optimized analytical storage; by running Spark directly on it, we again bring compute to the data and avoid moving data to compute.
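As an illustrative sketch of that pattern, here is a PySpark job reading a BigQuery table through the spark-bigquery connector; the table name is a hypothetical placeholder, and the connector is assumed to be on the classpath, as it is in Dataproc environments.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("spark-on-bigquery").getOrCreate()

# Reads are served through the BigQuery Storage API, so the job scans
# BigQuery-managed storage directly instead of exporting files first.
df = (spark.read.format("bigquery")
      .option("table", "my-project.my_dataset.events")
      .load())

df.groupBy("user_id").count().show()

On Dataproc Serverless, a script like this can be submitted as a batch with the gcloud dataproc batches submit pyspark command, with no cluster to create or manage.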
BigSearch to power Log Analytics

We are bringing the power of search to BigQuery. Customers already ingest massive amounts of log data into BigQuery and perform analytics on it, and they have been asking us for better support for native JSON and search. At Next we announced the upcoming availability of both capabilities. Fast cross-column search will provide efficient indexing of structured, semi-structured, and unstructured data, with user-friendly SQL functions that let customers rapidly find data points without having to scan all the text in a table, or even know which column the data resides in. This will be tightly integrated with native JSON support, allowing customers to get BigQuery performance and storage optimizations on JSON, as well as search over unstructured or constantly changing data structures.
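Until those capabilities land, a common version of this pattern is to store log payloads as JSON in STRING columns and extract fields with SQL JSON functions; a small sketch with hypothetical table and field names:

from google.cloud import bigquery

client = bigquery.Client()

# Count checkout-service log lines by severity, parsing JSON at query time.
rows = client.query("""
    SELECT JSON_EXTRACT_SCALAR(payload, '$.severity') AS severity,
           COUNT(*) AS n
    FROM `my_dataset.raw_logs`
    WHERE JSON_EXTRACT_SCALAR(payload, '$.service') = 'checkout'
    GROUP BY severity
    ORDER BY n DESC
""").result()
for row in rows:
    print(row.severity, row.n)

The announced native JSON type and search indexes should make exactly this kind of query both simpler to write and much cheaper to run.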
Multi & Cross-Cloud Analytics

Research on multi-cloud adoption is unequivocal: the overwhelming majority of businesses report having a multi-cloud strategy. We have always believed in giving our customers choice and meeting them where they are. It was clear that our customers wanted us to take gems like BigQuery to other clouds, as their data is distributed across different clouds. It was also clear that customers wanted cross-cloud analytics, not multi-cloud solutions that can merely run in different clouds. In short, they want to see all their data in a single pane of glass, perform analysis on top of any data without worrying about where it is located, avoid egress costs, and perform cross-cloud analysis across datasets on different clouds. With BigQuery Omni, we deliver on this vision with a new way of analyzing data stored in multiple public clouds. Unlike competitors, BigQuery Omni does not create silos across different clouds: BigQuery provides a single control plane that shows an analyst all the data they have access to across all clouds. The analyst just writes the query, and we send it to the right cloud, whether AWS, Azure, or GCP, to execute it locally; hence no egress costs are incurred. We announced BigQuery Omni GA for both AWS and Azure at Google Next, and I'm really proud of the team for delivering on this vision. Check out Vidya's session and learn from Johnson & Johnson how they innovate in a multi-cloud world.

Geospatial Analytics with BigQuery and Earth Engine

We have partnered with the Google geospatial team over the years to deliver GIS functionality inside BigQuery. At Next we announced that customers will be able to integrate Earth Engine with BigQuery, Google Cloud's ML technologies, and Google Maps Platform. Think about all the scenarios and use cases your teams will be able to enable: sustainable sourcing, saving energy, or understanding business risks. We're integrating the best of Google and Google Cloud to again make it easier to work with data and to create a sustainable future for our planet.

BigQuery as a Data Exchange & Sharing Platform

BigQuery was built to be a sharing platform. Today, organizations share petabytes of data with each other through BigQuery, and Google brings a large catalog of public datasets for use across various scenarios. In addition, we are bringing some of our most unique datasets, like Google Trends, to BigQuery, enabling organizations to understand trends in real time and apply them to their business problems. I am also super excited about the Analytics Hub preview announcement. Analytics Hub will provide the ability for organizations to build private and public analytics exchanges covering data, insights, ML models, and visualizations, built on top of the industry-leading security capabilities of BigQuery.
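The public datasets are the easiest way to feel this sharing model in action: any BigQuery user can query them directly. A quick sketch against the well-known Shakespeare sample table:

from google.cloud import bigquery

client = bigquery.Client()

# Query a Google-hosted public dataset; no copies or exports needed.
rows = client.query("""
    SELECT word, SUM(word_count) AS total
    FROM `bigquery-public-data.samples.shakespeare`
    GROUP BY word
    ORDER BY total DESC
    LIMIT 5
""").result()
for row in rows:
    print(row.word, row.total)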
Breaking Data Silos

Data is distributed across various systems in the organization, and making it easy to break down those silos so the data is accessible to all is critical. I'm particularly excited about the Migration Factory we're building with Informatica, and the work we are doing on data movement and intelligent data wrangling with players like Trifacta and Fivetran, with whom we share a large and growing base of joint customers. Additionally, we continue to deliver native Google services to help our customers. We acquired Cask in 2018 and launched Data Fusion, our self-service data integration service, in 2019; Fusion now allows customers to create complex pipelines with simple drag and drop. This year we focused on unlocking SAP data for our customers and launched various SAP connectors and accelerators to achieve this. At Next we also announced our BigQuery Migration Service in preview. Many of our customers are migrating their legacy data warehouses and data lakes to BigQuery, and BigQuery Migration Service provides end-to-end tools to simplify those migrations. And to make migrations to BigQuery easier for even more customers, I am super excited to announce the acquisition of CompilerWorks. The CompilerWorks Transpiler is designed from the ground up to facilitate SQL migration in the real world and will help our customers accelerate their migrations. It supports migrations from a wide range of legacy enterprise data warehouses, and we will be making it available as part of our BigQuery Migration Service in the coming months.

Data Democratization with BigQuery

Over the past years we have focused a lot on making it very easy to derive actionable insights from data in BigQuery. Our priority has been to provide a strong ecosystem of partners with great tools for this, while also delivering native Google capabilities. BI Engine, which we introduced in preview earlier this year and showcased with tools like Microsoft Power BI and Tableau, is now GA and available for all to play with. BigQuery and Data Studio are like peanut butter and jelly: they just work well together. We launched BI Engine first with Data Studio and scaled it to all its users, and a large share of our BigQuery customers use Data Studio. Once we knew BI Engine worked extremely well, we made it an integral part of the BigQuery API and launched it for all our internal and partner BI tools. We announced GA for BI Engine at Next, but it had already been GA with Data Studio for years. We recently moved the Data Studio team back into Google Cloud, making the partnership even stronger. If you have not used Data Studio, I encourage you to take a look; you can get started for free today. Connected Sheets for BigQuery is one of my favorite combinations: you can give every business user in your organization the ability to analyze billions of records using the standard Google Sheets experience. I personally use it every day to analyze all our product data. We acquired Looker in February 2020 with a vision of providing our customers a semantic modeling layer within a governed BI solution. Looker is tightly integrated with BigQuery, including BigQuery ML, and through our latest partnership with Tableau, Tableau customers will soon be able to leverage Looker's semantic model, enabling new levels of data governance while democratizing access to data. Finally, I have a dream that one day we will bring Google Assistant to your enterprise data. This is the vision of Data QnA. We are in the early innings on this, and we will continue to work hard to make that vision a reality.

Intelligent Data Fabric to unify the platform

Another important trend shaping our market is the data mesh; earlier this year Starburst invited me to talk about this very topic. We have been working on this concept for years, and although we would love for all data to be neatly organized in one place, we know that our customers' reality is that it is not. If you want to know more, read about my debate on this topic with Fivetran's George Fraser, a16z's Martin Casado, and Databricks' Ali Ghodsi. Everything I've learned from customers over my years in this field is that they don't just need a data catalog or a set of data quality and governance tools; they need an intelligent data fabric. That is why we created Dataplex, whose general availability we announced at Next. Dataplex enables customers to centrally manage, monitor, and govern data across data lakes, data warehouses, and data marts, while ensuring data is securely accessible to a variety of analytics and data science tools. It lets customers organize and manage data in a way that makes sense for their business, without data movement or duplication, and it provides logical constructs (lakes, data zones, and assets) that abstract away the underlying storage systems and form a foundation for setting policies around data access, security, lifecycle management, and so on. Check out Prajakta Damle's session and learn from Deutsche Bank how they are thinking about a unified data mesh across distributed data.

Closing Thoughts

Analysts have recognized our momentum, and as I look back at this year I couldn't thank our customers and partners enough for the support they provided my team and me across our large data analytics portfolio. In March, Google BigQuery was named a Leader in The Forrester Wave: Cloud Data Warehouse, Q1 2021, and in June, Dataflow was named a Leader in The Forrester Wave: Streaming Analytics, Q2 2021 report. If you want a taste of why customers choose us over other hyperscalers for cloud data warehousing, I suggest you watch the Data Journey series we've just launched, which documents the stories of organizations modernizing to the cloud with us. The Google Cloud data analytics portfolio has become a leading force in the industry, and I couldn't be more excited to have been part of it. I do miss you, my customers and partners, and I'm frankly bummed that we didn't get to meet in person like we've done so many times before (see a photo of my last in-person talk before the pandemic), but this Google Next was extra special, so let's dive into the product innovations and their themes. I hope I get to see you in person the next time we run Google Next.
2021-10-25 17:30:00 |