Posted at 2022-02-18 05:27:32 RSS Feed 2022-02-18 05:00 digest (30 items)

Category / Site / Article title or trend word / Link URL / Frequent words, summary, or search volume / Date registered
AWS AWS Back to Basics: Using AWS Config and Conformance Packs to Optimize Your Database https://www.youtube.com/watch?v=dndoIEyBhJw Back to Basics: Using AWS Config and Conformance Packs to Optimize Your Database. Managing your resource configurations and continuously evaluating configuration changes across your workload can be challenging. Join Chloe as she walks through how you can leverage AWS Config and Conformance Packs to ensure your resources are properly managed and optimized for production workloads. Additional resources: AWS Config Sample Conformance Packs, AWS Config Custom Rules, AWS Systems Manager Documents. Check out more resources for architecting in the AWS cloud. #AWS #AmazonWebServices #CloudComputing #BackToBasics 2022-02-17 19:18:55
Overseas TECH Ars Technica Elon Musk tweets, then deletes, Holocaust joke https://arstechnica.com/?p=1835084 offensive 2022-02-17 19:39:40
Overseas TECH Ars Technica Tesla claims SEC is harassing Elon Musk to muzzle his criticism of government https://arstechnica.com/?p=1835047 tweets 2022-02-17 19:33:44
Overseas TECH Ars Technica Physical console games are quickly becoming a relatively niche market https://arstechnica.com/?p=1835058 analysis 2022-02-17 19:21:19
Overseas TECH MakeUseOf What Is an Air-Gapped Network? Why Should You Use One? https://www.makeuseof.com/what-is-an-air-gapped-network/ learn 2022-02-17 19:30:12
Overseas TECH DEV Community Why and how to transpile dependencies of your JavaScript application https://dev.to/cubejs/why-and-how-to-transpile-dependencies-of-your-javascript-application-3phf Why and how to transpile dependencies of your JavaScript application. If you're a web developer, I'm sure that you use bundlers (e.g., Webpack, Rollup, or Parcel) which transpile the JavaScript code of your application with Babel under the hood. No doubt you also use various dependencies to cut down the development time. However, I rarely see developers transpiling the code of their dependencies, because everything seems to work fine without it, right? Wrong. Here's why. In this blog post I'm sharing bits of my experience contributing to open source. It's both my hobby and my job at Cube, where we create open source tools for building data applications.
Adoption of ESM. Before browsers and Node.js got native support for ES modules, an npm package could contain several variants of source code: a CommonJS variant that doesn't use modern features of JavaScript such as arrow functions (it's compatible with most versions of Node.js and browsers, and its location is specified in the main field of package.json), and a module variant that has ES imports/exports but still doesn't use modern features of JavaScript. Modules allow for tree shaking, i.e., excluding unused code from a bundle; this variant's location is specified in the module field of package.json (see this discussion for details). Obviously, not every npm package follows this convention; it's a choice that every author of a library makes on their own. However, it's quite common for browser-side and universal libraries to be distributed in two variants. Web browsers have natively supported ES modules for more than three years already, and Node.js has supported them since a release in November. That's why authors of libraries now include one more variant of source code for execution environments that natively support ES modules, and many packages have completely removed support for CommonJS.
Perks and features of ESM. It's important to understand that native ES modules are very much different from modules that merely have ES imports/exports. Here are a few reasons. The way we are used to writing imports/exports will not work natively; such code is intended for further processing by a bundler or a transpiler. Native ES modules don't support index file name and extension resolution, so you have to specify them explicitly in the import path: import from 'utils' won't work, while import from 'utils/index.js' works. If an npm package has ES modules, you have to add "type": "module" to package.json and specify the source code location in the exports field (see the docs for details). You can check the blog post by Axel Rauschmayer to learn more about ES modules.
Support for JavaScript features. Since we know which versions of web browsers and Node.js support ES modules, we also know for sure which features of JavaScript we can use in the source code. For instance, all web browsers that support ES modules natively also support arrow functions, so we don't need to avoid them or use Babel to transpile them into regular functions. Authors of libraries know that and ship source code that leverages all the modern features of JavaScript.
Transpilation of dependencies' code. But wait! What can you do to make sure that your web application works in legacy browsers? Also, what do you do if any of your application's dependencies use modern features of JavaScript that aren't supported by popular browsers? In both cases you need to transpile the code of the dependencies.
Manual transpilation. Let's assume that we're using webpack and babel-loader. Often the config has a module.rules entry that tests for .js/.jsx files, excludes node_modules, and runs babel-loader with @babel/preset-env targeting defaults (a cleaned-up config sketch follows this entry). It's suggested in the documentation and usage examples for Babel and babel-loader to exclude node_modules from transpilation to optimize the performance. By removing the exclude rule, we enable the transpilation of dependencies' code in exchange for an increased bundling time. By providing a custom function as the exclude rule, we can also transpile just a subset of all dependencies, e.g., everything under node_modules that belongs to nanostores or p-limit. We can also select files by their filename extensions, e.g., .js, .mjs, or .es files.
Manual transpilation: the benchmark. Let's check how the babel-loader configuration affects the bundle size and bundling time. Consider an application with three very different dependencies: svelte (uses modern features of JavaScript such as arrow functions), p-limit (uses bleeding-edge features of JavaScript such as private class fields), and axios (regular ES code). With the basic config there is no transpilation, so there is no way to predict which web browsers will work with the bundle. With targets set to defaults plus native ES module support, the code is transpiled so that private class fields are downgraded while arrow functions and classes remain as is, and the bundle works in modern browsers. With targets set to defaults plus polyfills, everything is transpiled down and the bundle works in all browsers. You can see how each babel-loader configuration trades bundle size against bundling time. Please also note that the basic, non-transpiled bundle won't work with legacy browsers because of p-limit. By the way, I also have a blog post that explains in detail how to build several bundles for different browsers. Okay, but what if you don't want to tinker with configs and specify files and packages to be transpiled manually? Actually, there's a readily available tool for that.
Transpilation with optimize-plugin. Meet optimize-plugin for webpack by Jason Miller from Google (developit). It will take care of everything and even more. It will transpile your application's source code and the code of all dependencies. If needed, it will generate two bundles for modern and legacy browsers using the module/nomodule pattern. On top of that, it can also upgrade older ES code to modern ES using babel-preset-modernize. Let's see what optimize-plugin does to our example application with three dependencies: from a single config it produces a bundle of modern ES code plus a bundle with polyfills for legacy browsers, so the result works in all browsers. As you can see, optimize-plugin is not only faster than babel-loader, but it also produces a smaller bundle. You can check my results using the code from my optimize-plugin demo repository.
Why optimize-plugin wins. The performance boost is possible because the code is analyzed and bundled only once; after that, optimize-plugin transpiles it for modern and legacy browsers. The smaller bundle size is possible thanks to babel-preset-modernize. Chances are that you use modern ES features in your application's code, but you can never predict which features are used in the source code of the dependencies. Since optimize-plugin works with the bundle that already has the code of all dependencies, it can transpile it as a whole. Here's how babel-preset-modernize works. Consider a snippet where an array of items with id and price fields is summed with items.reduce using a regular anonymous function that reads item.price and returns sum + price. After transpilation to modern ES, the reducer becomes an arrow function that destructures price from its argument: items.reduce((sum, { price }) => sum + price). Here's what has changed: a regular anonymous function was upgraded to an arrow function, the item.price field access was replaced with function-argument destructuring, and the code shrank. Note that we applied only two transformations here, but babel-preset-modernize can do a lot more.
What's next? optimize-plugin works really great, but it still has some room for improvement. Recently I've contributed a few pull requests, including improved webpack support. If optimize-plugin looks promising to you, I encourage you to give it a try in your projects, and maybe contribute some improvements as well. Anyway, starting today, please always transpile the code of your dependencies, whether with optimize-plugin or not, to make sure that you have full control over your application's compatibility with modern and legacy browsers. Good luck! Also, if you are building a data application, check out Cube. It can help you build an API for metrics that you can use in your application within minutes. 2022-02-17 19:33:41
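For reference, here is a minimal webpack config sketch of the selective-transpilation approach the entry describes. It is not the article's verbatim code: the test/exclude regexes and the nanostores/p-limit package names are assumptions that mirror the entry's example, so adapt them to the dependencies your own application actually needs transpiled.

```js
// webpack.config.js -- a sketch of babel-loader with selective node_modules transpilation.
module.exports = {
  module: {
    rules: [
      {
        test: /\.jsx?$/,
        // Exclude node_modules in general, but keep transpiling the two
        // dependencies that ship modern / bleeding-edge JavaScript.
        exclude: (file) =>
          /node_modules/.test(file) &&
          !/node_modules[\\/](nanostores|p-limit)[\\/]/.test(file),
        use: {
          loader: 'babel-loader',
          options: {
            presets: [['@babel/preset-env', { targets: 'defaults' }]],
          },
        },
      },
    ],
  },
};
```

Dropping the exclude condition entirely transpiles every dependency at the cost of a longer build, which is the trade-off the benchmark in the entry discusses.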
Overseas TECH DEV Community How to send Solana transaction using Python ✨ https://dev.to/0xbolt/how-to-send-solana-transaction-using-python-1dii How to send a Solana transaction using Python. In this tutorial we are going to see how we can send a transaction on the Solana blockchain using Python. For this you do not need to know the complexity of what happens behind the scenes. We will be using Solathon, which is an easy-to-use, feature-rich SDK for Solana in Python. Make sure you have a recent version of Python installed, then install the package with: pip install solathon. First of all, we need to create a client which we will use to interact with Solana's JSON RPC. For testing purposes we will be using devnet, as it doesn't require you to have a real SOL balance. The client is created on devnet along with all the imports we will need later on: from solathon.core.instructions import transfer; from solathon import Client, Transaction, PublicKey, Keypair; client = Client(...). Now we need to define three things: the account which is sending the transaction, the account which will receive the transaction, and the amount we want to send in lamports, which is the standard unit for amounts in transactions on the Solana blockchain: sender = Keypair.from_private_key("your private key here"); receiver = PublicKey("receiver public key"); amount = .... The sender needs to be a Keypair, which is a combination of a public key and a private key; you need to get your private key from your wallet in order to initialize your Keypair object. Then we need the public key of the account we wish to send the transaction to, and finally the amount we want to transfer in lamports; you can use the sol_to_lamport function provided by Solathon for easy conversions. Each transaction requires certain instructions to be executed; here we just need one instruction, and that is transfer: instruction = transfer(from_public_key=sender.public_key, to_public_key=receiver, lamports=amount). Now we initialize our Transaction object with our instruction and signers, which in this case is only the sender: transaction = Transaction(instructions=[instruction], signers=[sender]). Finally, we can send the transaction and print the result: result = client.send_transaction(transaction); print("Transaction response:", result). The entire program is collected in the sketch after this entry. You can also check out the documentation for sending a transaction (send_transaction). How easy is that! I hope I was able to make things clear and the tutorial was helpful; if something still confuses you or you have any question in general, feel free to ask. See you soon! 2022-02-17 19:30:12
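The following is a cleaned-up reconstruction of the snippet embedded in the entry above, using only the Solathon names that appear there (Client, Keypair, PublicKey, transfer, Transaction, send_transaction). The devnet RPC URL, the placeholder keys, and the lamport amount are assumptions filled in for illustration.

```python
from solathon.core.instructions import transfer
from solathon import Client, Transaction, PublicKey, Keypair

# Assumed public devnet RPC endpoint; devnet does not require real SOL.
client = Client("https://api.devnet.solana.com")

# Placeholders: use your own wallet's private key and a real recipient public key.
sender = Keypair.from_private_key("your_private_key_here")
receiver = PublicKey("receiver_public_key_here")
amount = 1_000_000  # lamports to send (illustrative value)

# A single transfer instruction, signed by the sender.
instruction = transfer(
    from_public_key=sender.public_key,
    to_public_key=receiver,
    lamports=amount,
)
transaction = Transaction(instructions=[instruction], signers=[sender])

result = client.send_transaction(transaction)
print("Transaction response:", result)
```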
Overseas TECH DEV Community Welcome Thread - v164 https://dev.to/thepracticaldev/welcome-thread-v164-3b6l Welcome Thread v164. Welcome to DEV! Leave a comment below to introduce yourself. You can talk about what brought you here, what you're learning, or just a fun fact about yourself. Reply to someone's comment, either with a question or just a hello. Great to have you in the community! 2022-02-17 19:05:29
Apple AppleInsider - Frontpage News Apple is 'ungodly well-managed,' says Berkshire Hathaway vice chair https://appleinsider.com/articles/22/02/17/apple-is-ungodly-well-managed-says-berkshire-hathaway-vice-chair?utm_medium=rss Apple is 'ungodly well-managed,' says Berkshire Hathaway vice chair. Berkshire Hathaway Vice Chairman Charlie Munger has praised Apple's management, calling the Cupertino tech giant one of the strongest companies and adding that it'll likely stay that way. Credit: Laurenz Heymann, Unsplash. Apple represents Berkshire Hathaway's largest stock holding, despite the fact that Munger and Berkshire chairman Warren Buffett often talk about their lack of expertise in the technology industry. Read more 2022-02-17 19:33:43
Overseas TECH Engadget Ford's Mustang Mach-E ousts the Tesla Model 3 as Consumer Reports' top EV https://www.engadget.com/consumer-reports-top-ev-ford-mustang-mach-e-193104278.html?src=rss Ford's Mustang Mach-E ousts the Tesla Model 3 as Consumer Reports' top EV. Tesla's Model 3 has been Consumer Reports' top EV choice for the past two years, but the publication is ready to declare a new champion: CR has revealed that Ford's Mustang Mach-E has ousted the Model 3 as its EV Top Pick. The Mustang crossover is not only "more practical," according to editors, but has better first-year reliability and a "far easier" infotainment system that doesn't require multiple steps for basic tasks; a better ride and reduced noise help, too. Ford's BlueCruise driver-assist technology also gave the Mach-E an edge thanks to a more effective driver-monitoring system that now counts toward vehicle scores, while Tesla's Autopilot was docked for functioning while drivers look away. Consumer Reports still recommended the Model 3 thanks to its sports-car-like performance, long range, charging network, and technology. However, the outlet couldn't recommend the Mach-E's more direct rival, the Model Y, as an EV Top Pick: Tesla's SUV-like ride has "much worse" reliability than average vehicles in the lineup and is noticeably worse than the average-rated Model 3. This isn't going to please Tesla, which has had a less-than-amicable relationship with Consumer Reports over the years: the two have disputed test results, and CR has temporarily pulled recommendations for some models. However, it also reflects lingering concerns about Tesla's reliability; the EV producer has issued a string of recalls in recent months, and owners have frequently reported build-quality issues. This might not have cost Tesla the lead by itself, but it certainly didn't help the company's chances. 2022-02-17 19:31:04
Cisco Cisco Blog Using NSO with pyATS Parsers to Check Operational State https://blogs.cisco.com/developer/nsowithpyatsparsers01 excel 2022-02-17 19:50:12
Overseas Science NYT > Science Dr. Herbert Benson, Who Saw the Mind as Medicinal, Dies at 86 https://www.nytimes.com/2022/02/17/health/dr-herbert-benson-dead.html Dr. Herbert Benson, who saw the mind as medicinal, dies at 86. A cardiologist and best-selling author, he was initially a skeptic before finding that a person can influence bodily health through Eastern-style meditation. 2022-02-17 19:32:55
Overseas Science NYT > Science Expecting the Western Drought to End Soon? Not Likely, Forecasters Say. https://www.nytimes.com/2022/02/17/climate/noaa-weather-western-drought.html Expecting the Western drought to end soon? Not likely, forecasters say. Despite some wet weather last fall, warm and dry conditions have settled in and are expected to continue through spring and beyond, according to a new assessment. 2022-02-17 19:37:43
Medical Medical & Long-Term Care CBnews Key points of the care worker treatment improvement support subsidy: how will long-term care management change? Michihiro Kohama on the current landscape (23) https://www.cbnews.jp/news/entry/20220217164827 long-term care fees 2022-02-18 05:00:00
News BBC News - Home Storm Eunice: Rare red weather warning issued for parts of the UK https://www.bbc.co.uk/news/uk-60417263?at_medium=RSS&at_campaign=KARANGA eunice 2022-02-17 19:43:48
News BBC News - Home Rangers earn sensational Europa League first-leg win at Borussia Dortmund https://www.bbc.co.uk/sport/football/60338532?at_medium=RSS&at_campaign=KARANGA Rangers earn a sensational first-leg victory at Borussia Dortmund to put themselves firmly in the driving seat for a place in the next round of the Europa League. 2022-02-17 19:40:30
Business Diamond Online - New Articles What kind of "missing talent" do companies whose communication breaks down under telework lack? - What your boss won't teach you: the things that really matter at work https://diamond.jp/articles/-/296234 With the Omicron variant raging, many companies have likely been instructed to raise their telework rates once again. 2022-02-18 04:55:00
Business Diamond Online - New Articles Ranking of companies whose earnings look set to worsen from "decarbonization" [Pharmaceuticals]: No. 3 Otsuka HD, No. 2 Tsumura, and No. 1 is...? - Japan Sinking: The wealthy who are abandoning Japan https://diamond.jp/articles/-/294127 2022-02-18 04:50:00
Business Diamond Online - New Articles Indonesia's temporary coal export ban and the risks to its "post-Omicron" economic recovery - Toru Nishihama's Emerging Markets Scope https://diamond.jp/articles/-/296683 puts a damper on 2022-02-18 04:45:00
Business Diamond Online - New Articles Is the "counterattack of online supermarkets" beginning? Three trends that could change the common sense of shopping - Thinking at an Angle Again This Week, Takahiro Suzuki https://diamond.jp/articles/-/296682 survey results 2022-02-18 04:40:00
Business Diamond Online - New Articles Why changing origin-labeling rules won't prevent clam mislabeling - News Three-Way Mirror https://diamond.jp/articles/-/296681 But that alone will not put an end to clam mislabeling. 2022-02-18 04:35:00
Business Diamond Online - New Articles Japanese food brands eaten up by China: why topping 1 trillion yen in agricultural exports is nothing to celebrate - China Report: China Now https://diamond.jp/articles/-/296680 Agricultural and food exports from Japan have finally broken the 1-trillion-yen mark, but exporting Japanese food brands into a Chinese market where Japanese crop varieties are already circulating is becoming ever more difficult. 2022-02-18 04:30:00
Business Diamond Online - New Articles Why you should start reading a book the very day you buy it - News Three-Way Mirror https://diamond.jp/articles/-/295481 youtuber 2022-02-18 04:25:00
Business Diamond Online - New Articles Why beginner investors still take big losses even in a bull market - A "retirement funds" planning course for beginners https://diamond.jp/articles/-/296291 relationship 2022-02-18 04:20:00
Business Diamond Online - New Articles Why numerical targets don't motivate employees, and the knack of flipping the "action switch" - News Three-Way Mirror https://diamond.jp/articles/-/295607 sales 2022-02-18 04:15:00
Business Diamond Online - New Articles The Apple Watch's "other face": it's this useful in business settings too! - Technology that transforms business https://diamond.jp/articles/-/296679 applewatch 2022-02-18 04:10:00
Business Diamond Online - New Articles How many habit-forming "old-guy business buzzwords" do you use? "Eiya," "zen'in yakyu," "yoshina ni"... - News&Analysis https://diamond.jp/articles/-/296677 news&analysis 2022-02-18 04:05:00
Business Toyo Keizai Online ECB President Lagarde candidly admits the limits of monetary policy: if demand stays weak, "crushing inflation" carries risks | Market Watch | Toyo Keizai Online https://toyokeizai.net/articles/-/512301?utm_source=rss&utm_medium=http&utm_campaign=link_back Toyo Keizai Online 2022-02-18 04:50:00
Business Toyo Keizai Online Tama Center, the heart of the New Town, and the town's half century: an "isolated island" when move-ins began, now served by three rail lines | Stations & Redevelopment | Toyo Keizai Online https://toyokeizai.net/articles/-/512117?utm_source=rss&utm_medium=http&utm_campaign=link_back rail service 2022-02-18 04:30:00
GCP Cloud Blog Cloud Spanner myths busted https://cloud.google.com/blog/products/databases/cloud-spanner-myths-busted/ Cloud Spanner myths busted.
Intro to Cloud Spanner. Cloud Spanner is an enterprise-grade, globally distributed, externally consistent database that offers unlimited scalability and industry-leading availability. It requires no maintenance windows and offers a familiar PostgreSQL interface. It combines the benefits of relational databases with the unmatched scalability and availability of non-relational databases. As organizations modernize and simplify their tech stack, Spanner provides a unique opportunity to transform the way they think about and use databases as part of building new applications and customer experiences. But choosing a database for your workload can be challenging: there are so many options in the market, and each one has a different onboarding and operating experience. At Google Cloud we know it's hard to navigate this choice and are here to help you. In this blog post I want to bust the seven most common misconceptions that I regularly hear about Spanner, so that you can confidently make your decision.
Myth: Only use Spanner if you have a massive workload. The truth is that Spanner powers Google's most popular, globally available products like YouTube, Drive, and Gmail, and has enabled many large-scale transformations, including those of Uber, Niantic, and Sharechat. It is also true that Spanner processes more than a billion queries per second at peak. At the same time, many customers also use Spanner for their smaller workloads, both in terms of transactions per second and storage size, for availability and scalability reasons. For example, Google Password Manager has small workloads that run on Spanner. These customers cannot tolerate downtime, require high availability to power their applications, and seek scale insurance for future growth. Limitless scalability with the highest availability is critical in many industry verticals such as gaming and retail, especially when a newly launched game goes viral and becomes an overnight success, or when a retailer has to handle a sudden surge in traffic during a Black Friday/Cyber Monday sale. Regardless of workload size, every customer on the journey to the cloud wants the benefits of scalability and availability while reducing the operational burden and the costs associated with patching, upgrades, and other maintenance.
Myth: Spanner is too expensive. The truth is that when looking at the cost of a database, it is better to consider Total Cost of Ownership (TCO) and the value it offers rather than the raw list price. We deliver significant value to our customers, including critical things like availability, price-performance, and reduced operational costs. Availability: Spanner provides high availability and reliability by synchronously replicating data; for disaster recovery it offers strong RPO and RTO guarantees for zonal failures in the case of a regional instance, and for regional failures in the case of multi-regional instances. Less downtime, more revenue. Price-performance: Spanner offers one of the industry's leading price-performance ratios, which makes it a great choice if you are running a demanding, performance-sensitive application; great customer experiences require consistently optimal latencies. Reduced operational cost: with Spanner, customers enjoy zero-downtime upgrades and schema changes and no maintenance windows. Sharding is handled automatically, so the challenges associated with scaling up traditional databases don't exist. Spend more time innovating and less time administering. Security and compliance: by default, Spanner already offers encryption for data in transit via its client libraries and for data at rest using Google-managed encryption keys. CMEK support for Spanner lets you have complete control of the encryption keys. Spanner also provides VPC Service Controls support and has the compliance certifications and approvals needed for workloads requiring ISO, PCI DSS, SOC, HIPAA, and FedRAMP. With Spanner you have peace of mind knowing that your data's security, availability, and reliability won't be compromised. And best of all, with the introduction of granular instance sizing, you can now get started for a small monthly cost and unlock the tremendous value Spanner offers. Pro tip: use the autoscaler to right-size your Spanner instances, and take advantage of TTL to reduce the amount of data stored.
Myth: You have to make a trade-off between scale, consistency, and latency. The truth is that, depending on the use case and instance configuration, users can use Spanner such that they don't have to pick between consistency, latency, and scale. To provide strong data consistency, Spanner uses a synchronous, Paxos-based replication scheme in which replicas acknowledge every write request. A write is committed when a majority of the replicas (called a quorum) agree to commit the write. In the case of regional instances, the replicas are within the region, so writes are faster than in the case of multi-region instances, where the replicas are distributed across multiple regions. In the latter case, forming a quorum on writes can result in slightly higher latency. Nevertheless, Spanner multi-regions are carefully designed in geographical configurations that ensure the replicas can communicate fast enough and write latencies remain acceptably low. A read can be served strong (the default) or stale. A strong read is a read at a current timestamp and is guaranteed to see all the data that has been committed up until the start of the read. A stale read is a read executed at a timestamp in the past. In the case of a strong read, the serving replica guarantees that you will see all data committed up until the start of the read; in some cases this means the serving replica has to contact the leader to ensure that it has the latest data. In the case of a multi-region instance where the read is served from a non-leader replica, read latency can be slightly higher than if it were served from the leader region. Stale reads are performed over data that was committed at a timestamp in the past and can therefore be served at very low latencies by the closest replica that is caught up to that timestamp. If your application is latency-sensitive, stale reads may be a good option, and we recommend using a stale-read value of a few seconds (see the sketch after this entry for what a stale read looks like in a client library).
Myth: Spanner does not have a familiar interface. The truth is that Spanner offers the flexibility to interact with the database via a SQL dialect based on the ANSI standard, as well as via REST or gRPC API interfaces, which are optimized for performance and ease of use. In addition to Spanner's own interface, we recently introduced a PostgreSQL interface for Spanner that leverages the ubiquity of PostgreSQL to meet development teams with an interface they are familiar with. The PostgreSQL interface provides a rich subset of the open-source PostgreSQL SQL dialect, including common query syntax, functions, and operators. It supports a core collection of open-source PostgreSQL data types, DDL syntax, and information_schema views. You get the PostgreSQL familiarity and relational semantics at Spanner scale. Learn more about our PostgreSQL interface here.
Myth: The only way to get observability data is via the Spanner console. The truth is that Spanner client libraries support OpenCensus tracing and metrics, which give insight into the client internals and aid in debugging production issues; for instance, client-side traces and metrics include session- and transaction-related information. Spanner also supports the OpenTelemetry receiver, which provides an easy way for you to process and visualize metrics from Cloud Spanner system tables and export them to the application performance monitoring (APM) tool of your choice. This could be an open-source combination of a time-series database like Prometheus coupled with a Grafana dashboard, or a commercial offering like Splunk, Datadog, Dynatrace, New Relic, or AppDynamics. We've also published reference Grafana dashboards so that you can debug the most common user journeys, such as "Why is my tail latency high?" or "Why do I see a CPU spike when my workload did not change?" There is a sample Docker service that shows how the Cloud Spanner receiver can work with the Prometheus exporter and Grafana dashboards. We are continuing to embrace open standards and to integrate with our partner ecosystem, and we continue to evolve the observability experience offered by the Google console so that our customers get the best experience wherever they are.
Myth: Spanner is only for global workloads requiring copies in multiple regions. The truth is that while Spanner offers a range of multi-region instance configurations, it also offers a regional configuration in each GCP region. Each regional node is replicated across zones within the region, while a multi-regional node is replicated multiple times across multiple regions. A regional configuration offers multiple nines of availability and protection against zonal failures. Typically, multi-regional instance configurations are indicated if your application runs workloads in multiple geographical locations or your business needs availability and protection against regional failures. Learn more here.
Myth: Spanner schema changes require expensive locks. The truth is that Spanner never takes table-level locks. Spanner uses a multi-version concurrency control architecture to manage concurrent versions of schema and data, allowing ad hoc, online, qualified schema changes that do not require any downtime, additional tools, migration pipelines, or complex rollback/backup plans. When issuing a schema update, you can continue writing to and reading from the database without interruption while Spanner backfills the update, whether you have a handful of rows or billions of rows in your table. The same mechanism can be used for point-in-time recovery (PITR) and snapshot queries, using stale reads to restore both the schema and the state of data at a given query condition and timestamp, up to a maximum of seven days. Now that we've learned the truth about Cloud Spanner, I invite you to get started: visit our website. Related article: Improved troubleshooting with Cloud Spanner introspection capabilities. Cloud-native database Spanner has new introspection capabilities to monitor database performance and optimize application efficiency. Read article. 2022-02-17 19:30:00
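To make the strong-versus-stale read distinction above concrete, here is a minimal sketch using the google-cloud-spanner Python client. The instance ID, database ID, table, and the 10-second staleness are illustrative placeholders, not values from the post.

```python
from datetime import timedelta

from google.cloud import spanner

# Placeholder resource names: substitute your own instance and database IDs.
client = spanner.Client()
instance = client.instance("my-instance")
database = instance.database("my-database")

# Stale read: results may lag real time by up to the given staleness,
# but can be served by the closest replica without contacting the leader.
with database.snapshot(exact_staleness=timedelta(seconds=10)) as snapshot:
    rows = snapshot.execute_sql("SELECT AlbumId, AlbumTitle FROM Albums")
    for row in rows:
        print(row)

# Strong read (the default): guaranteed to see everything committed
# before the read started, possibly at slightly higher latency.
with database.snapshot() as snapshot:
    rows = snapshot.execute_sql("SELECT COUNT(*) FROM Albums")
    print(list(rows))
```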
