Posted 2021-09-17 02:28:34. RSS feed digest as of 2021-09-17 02:00 (35 items)

Category | Site | Article title / trend word | Link URL | Keywords and summary / search volume | Date added
AWS AWS Architecture Blog Disaster Recovery (DR) for a Third-party Interactive Voice Response on AWS https://aws.amazon.com/blogs/architecture/disaster-recovery-dr-for-a-third-party-interactive-voice-response-on-aws/ Voice calling systems are prevalent and necessary to many businesses today. They are usually designed to provide helpline support across multiple domains and use cases. Reliability and availability of such systems are important for a good customer experience. The thoughtful design of a cost-optimized solution will allow your business to sustain the system … 2021-09-16 16:37:49
AWS AWS Partner Network (APN) Blog How Tamr Optimized Amazon EMR Workloads to Unify 200 Billion Records 5x Faster than On-Premises https://aws.amazon.com/blogs/apn/how-tamr-optimized-amazon-emr-workloads-to-unify-200-billion-records-5x-faster-than-on-premises/ Global business leaders recognize the value of advanced and augmented big data analytics over various internal and external data sources. However, technical leaders also face challenges capturing insights from data silos without unified master data. Learn how migrating Tamr's data mastering solutions from on-premises to AWS allowed a customer to process billions of records five times faster with fully managed Amazon EMR clusters. 2021-09-16 16:50:24
AWS AWS - Webinar Channel The creation of a well-oiled migration machine - AWS Virtual Workshop https://www.youtube.com/watch?v=lpnnCZLgYvQ The art of smoothly migrating workloads to AWS involves a number of key work streams and components coming together as one machine, with a clearly defined path from raw material inputs (portfolio data, platform designs, operational capabilities, etc.) to planning and the creation of precise runbooks that can be used to orchestrate the migration events, all working towards the goal of the seamless migration of business services to AWS. This virtual learning will provide practical examples of how to run and manage the portfolio requirements of a program, and how this data is used to plan and generate runbooks for the execution of migration events, providing you with tools and knowledge that can help you start your journey to AWS. Learning objectives: learn about the key components that make up a successful migration program; understand key data and tooling requirements for migration; learn how to construct migration runbooks and manage scope. 2021-09-16 16:34:59
python New posts tagged Python - Qiita How to display the X, Y, and Z axes of a matplotlib 2D histogram on a log scale https://qiita.com/yamadasuzaku/items/314280d9b35a81433801 Use case: introduces how to display all axes (X, Y, and Z) of a matplotlib 2D histogram on a logarithmic scale. 2021-09-17 01:29:41
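The technique that Qiita post describes can be sketched as a minimal, self-contained example (my own code, not the post's; the data, bin counts, and file name are illustrative): log-scale X and Y with set_xscale/set_yscale, log-scale the count (Z) axis with a LogNorm on the colormap, and use logarithmically spaced bins so the cells stay even on the log axes.

```python
# Minimal sketch (not the post's code): log-scale all three axes of a
# matplotlib 2D histogram. X/Y use set_xscale/set_yscale("log"); the
# count axis (Z) uses a LogNorm on the colour map.
import matplotlib
matplotlib.use("Agg")  # headless backend so this runs without a display
import matplotlib.pyplot as plt
import numpy as np
from matplotlib.colors import LogNorm

rng = np.random.default_rng(0)
x = 10 ** rng.uniform(0, 3, 10_000)  # samples spread over three decades
y = 10 ** rng.uniform(0, 3, 10_000)

fig, ax = plt.subplots()
# Logarithmically spaced bin edges keep the cells visually even on log axes.
edges = np.logspace(0, 3, 31)
counts, xedges, yedges, image = ax.hist2d(x, y, bins=[edges, edges], norm=LogNorm())
ax.set_xscale("log")  # log X axis
ax.set_yscale("log")  # log Y axis
fig.colorbar(image, ax=ax, label="counts (log scale)")
fig.savefig("hist2d_log.png")
```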
Program New questions (all tags) | teratail CSS mask-image is not being applied https://teratail.com/questions/359921?rss=all 2021-09-17 01:56:30
Program New questions (all tags) | teratail Debugging in VS Code (Go) https://teratail.com/questions/359920?rss=all centos 2021-09-17 01:09:26
Program New questions (all tags) | teratail Copying the output of pwd on a server into the local clipboard https://teratail.com/questions/359919?rss=all Goal: I want to copy the output of pwd on a Linux server into the clipboard on my local machine. 2021-09-17 01:03:54
Docker New posts tagged docker - Qiita Installing the msmtp SMTP client on Ubuntu 20.04 on Docker and testing mail delivery via Gmail's SMTP server https://qiita.com/yohama/items/b41a9bdb413010671819 apt update && apt dist-upgrade -y; installing msmtp and verifying it works. First, install msmtp via apt. 2021-09-17 01:29:27
Ruby New posts tagged Rails - Qiita How to handle boolean values in a dropdown https://qiita.com/yukikok/items/592b4d57a2ec6ef35781 Normally you would not display boolean values in a dropdown, but I had to do so at work and struggled with it, so I am writing it down. 2021-09-17 01:06:41
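The post above is about Rails, but the underlying trick (mapping human-readable dropdown labels to boolean values and back) is framework-neutral. Here is a hypothetical sketch in plain Python; the labels and helper names are my own, not from the post.

```python
# Hypothetical, framework-neutral sketch of the idea: a dropdown cannot
# hold booleans directly, so keep an explicit label <-> value mapping
# and convert in both directions.
OPTIONS = [("Yes", True), ("No", False)]  # (label shown, value stored)

def label_for(value: bool) -> str:
    """Label to pre-select in the dropdown for a stored boolean."""
    return next(label for label, v in OPTIONS if v is value)

def value_for(label: str) -> bool:
    """Boolean to store for a submitted dropdown label."""
    return dict(OPTIONS)[label]
```

In Rails itself the same mapping would simply be handed to a select helper as option pairs.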
Tech blog Developers.IO How To Static Website EP1: an introduction before getting started https://dev.classmethod.jp/articles/how-to-static-website-ep1-recommended-before-getting-started/ Hello everyone! Today Pitcha brings you a new article introducing how to build a static website using tools from … 2021-09-16 16:30:23
Overseas TECH DEV Community Standard Banking Demo: Discovering Entando CMS Components https://dev.to/entando/standard-banking-demo-discovering-entando-cms-components-4o9j Hey, my fellow developers! Here's the last blog post in the Standard Banking Demo series. After a deep dive into the microservices and micro frontends, we are now discovering the CMS components that make up this banking application and how we can manage our content to provide a great user experience.

Content Management: The Standard Demo Banner. In the Standard Banking Demo, banners are defined to display pieces of content in pages rendered in a modern website style. We will use this content type to explain how an Entando application leverages CMS components to build composable applications.

CMS component architecture. To manage content we need to know how to define it (the Type), how to display it (the Model), and how to create a new instance (the Content).
- Content Type: defines the attributes available when adding a new instance of content. The type is defined by a name and a code, and defines fields and rules to apply. Each attribute has a code, a name, and a type; attributes can be mandatory and be used to filter content.
- Content Template: defines how a ContentType is displayed. A ContentType can be displayed in different ways by creating additional Content Templates.
- Content: refers to a piece or instance of content and is based on a ContentType; it defines the values of the attributes for a ContentType.
- Assets: a collection of assets you can use in a piece of content and share across multiple content entries (e.g. an image).

When a page is designed, the Content widget is configured to render a piece of content based on the Content Template.

Compose components in the App Builder. In the Page Designer, this content is placed in a frame as a "Content" widget; here (in red) the Content widget is used to display the main banner on the homepage. This widget is configured with the content you want to display, along with the desired content template. In this case the Home SD page contains multiple Content widgets that all use the same content type (SDB). Although the content type is the same, each section is rendered differently by choosing different Content Templates: one template displays the main banner, another displays a light background, and another renders an accordion inside the banner. The content to display can be selected from the Settings menu in the kebab menu; then we can select the content we want to display (shown below in blue) and the template we want to use for this instance of content (in orange). Only a template linked to the same content type can be used. All of the CMS components are defined as code in the Standard Banking Demo.

CMS Components as Code. The Standard Demo Banner content type (reconstructed layout; numeric values lost in the feed are marked …):

    code: SDB
    name: Standard Demo Banners
    status: …
    attributes:
      - code: title
        type: Text
        names: {en: title}
        roles: []
        disablingCodes: []
        mandatory: true
        listFilter: false
        indexable: true
        validationRules:
          minLength: null, maxLength: null, regex: null,
          rangeStartString: null, rangeEndString: null,
          rangeStartStringAttribute: null, rangeEndStringAttribute: null,
          equalString: null, equalStringAttribute: null,
          rangeStartDate: null, rangeEndDate: null,
          rangeStartDateAttribute: null, rangeEndDateAttribute: null,
          equalDate: null, equalDateAttribute: null,
          rangeStartNumber: null, rangeStartNumberAttribute: null,
          rangeEndNumber: null, rangeEndNumberAttribute: null,
          equalNumber: null, equalNumberAttribute: null,
          ognlValidation: null

The Standard Demo Banner template references the content type through the contentType field:

    id: …
    contentType: SDB
    description: Main Banner
    contentShape: >
      <div class="main-banner">
        <div class="row">
          <div class="col-lg-… col-xs-… main-banner-center">
            <div class="text-wrapper">
              <h…>$content.title.text</h…>
              <p>$content.subtitle.text</p>
            </div>
          </div>
        </div>
      </div>

The Standard Demo Banner content references the content type through the typeCode field:

    id: SDB…
    typeCode: SDB
    description: main banner
    mainGroup: free
    status: PUBLIC
    attributes:
      - code: title
        value: null
        values: {en: "A Better Way to Bank"}
        elements: []
        compositeelements: []
        listelements: []

The Standard Demo homepage: the page template defines the widgets on the page, like the Content widget (code: content viewer). The Content widget is then configured to display the top banner using the contentId (SDB…) and the template (modelId):

    code: homepagesd
    parentCode: homepage
    titles: {en: Home SD, it: Home SD}
    pageModel: seed home
    ownerGroup: free
    joinGroups: []
    displayedInMenu: true
    seo: false
    charset: utf-8
    status: published
    widgets:
      - code: Brand Logo
        config: null
        pos: …
      - code: Login buttons
        config: null
        pos: …
      - code: content viewer
        config:
          contentDescription: main banner
          modelId: …
          ownerGroup: free
          contentId: SDB…
          joinGroups: []

Conclusion. This series ends with the CMS Standard Banking Demo overview: it provides a working example of how traditional micro frontends can be composed alongside CMS components. The banner example is great for understanding how a given content type can be used in multiple ways to render different content using the same attributes, improving your capacity to deliver content quickly and easily. User-editable content can also be exported using the bundle export/import feature and managed as code, along with micro frontends and microservices, across multiple environments or clusters. Now it's time to play with the Standard Banking Demo! Be sure to follow our documentation and reach out on the Entando forum for feedback. 2021-09-16 16:53:02
Overseas TECH DEV Community Standard Banking Demo: JHipster Generated Microservices and Micro Frontends https://dev.to/entando/standard-banking-demo-jhipster-generated-microservices-and-micro-frontends-265o Hi, my fellow developers! The second episode of the Entando Standard Banking Demo series shows how to call JHipster-generated microservices using micro frontends. Going one step beyond a hello-world app, the Standard Banking Demo helps you understand how a complex distributed application works with Entando. This article details the code architecture, the entity definitions from the domain level to the top API level, and finally how the frontend code leverages them. Let's dive into the code!

Introduction. Entando defines component types to describe the different parts of your applications as code. Every part of an Entando application can be defined using components, such as the assets, content, pages, plugins, and widgets that make up your application. A microservice is deployed as a plugin, using an image to run a container on Kubernetes as a pod. A micro frontend is deployed as a widget, using a JavaScript web component, and is included in a page. These components can be created from scratch; however, Entando provides a JHipster blueprint called the Entando Component Generator (ECG) that speeds up coding time by scaffolding your component: the data layer (domain and repository), the business layer (including the service and data transfer objects), and the API, which can be consumed with HTTP requests. By default the ECG generates micro frontends per entity to view, edit, and list the data. These micro frontends cover the CRUD operations and can be customized to meet your needs; for advanced use cases you can also implement your own micro frontends. This article covers the banking microservice and the micro frontend that uses its API.

Banking Microservice. The banking app is used throughout this blog post to demonstrate what we can find in the Standard Banking Demo. You can find the code for the Standard Banking Demo bundle here, and the code for the banking microservice here.

Backend Code: Focus on the CreditCard Entity. The backend contains entities defined using the JHipster Domain Language. In this article we focus on the Creditcard entity, for which several generated classes exist.

The domain layer. The lowest level is the domain object, in the org.entando.demo.banking.domain package (reconstructed from the flattened listing):

    @Entity
    @Table(name = "creditcard")
    @Cache(usage = CacheConcurrencyStrategy.READ_WRITE)
    public class Creditcard implements Serializable {

        private static final long serialVersionUID = 1L;

        @Id
        @GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "sequenceGenerator")
        @SequenceGenerator(name = "sequenceGenerator")
        private Long id;

        @Column(name = "account_number")
        private String accountNumber;

        @Column(name = "balance", precision = 21, scale = 2)
        private BigDecimal balance;

        @Column(name = "reward_points")
        private Long rewardPoints;

        @Column(name = "user_id")
        private String userID;
        …
    }

The Repository is the interface that extends a Spring Data interface to retrieve content from the database, and defines requests that can be used for this given entity; it can be found under the org.entando.demo.banking.repository package:

    @Repository
    public interface CreditcardRepository extends JpaRepository<Creditcard, Long>, JpaSpecificationExecutor<Creditcard> {
        Optional<Creditcard> findByUserID(String userID);
    }

The Service layer. The Service layer contains the business code for this entity; the service sits between the data and API layers. Here is the service class that implements the interface:

    @Service
    @Transactional
    public class CreditcardServiceImpl implements CreditcardService {

        private final Logger log = LoggerFactory.getLogger(CreditcardServiceImpl.class);

        private final CreditcardRepository creditcardRepository;

        public CreditcardServiceImpl(CreditcardRepository creditcardRepository) {
            this.creditcardRepository = creditcardRepository;
        }

        @Override
        public Creditcard save(Creditcard creditcard) {
            log.debug("Request to save Creditcard : {}", creditcard);
            return creditcardRepository.save(creditcard);
        }

        @Override
        @Transactional(readOnly = true)
        public Page<Creditcard> findAll(Pageable pageable) {
            log.debug("Request to get all Creditcards");
            return creditcardRepository.findAll(pageable);
        }

        @Override
        @Transactional(readOnly = true)
        public Optional<Creditcard> findOne(Long id) {
            log.debug("Request to get Creditcard : {}", id);
            return creditcardRepository.findById(id);
        }

        @Override
        public void delete(Long id) {
            log.debug("Request to delete Creditcard : {}", id);
            creditcardRepository.deleteById(id);
        }

        @Override
        @Transactional(readOnly = true)
        public Optional<Creditcard> findOneWithUserID(String userID) {
            log.debug("Request to get Creditcard with userID : {}", userID);
            return creditcardRepository.findByUserID(userID);
        }
    }

Next is the QueryService, for advanced search requests using Spring Data Specifications:

    @Service
    @Transactional(readOnly = true)
    public class CreditcardQueryService extends QueryService<Creditcard> {

        private final Logger log = LoggerFactory.getLogger(CreditcardQueryService.class);

        private final CreditcardRepository creditcardRepository;

        public CreditcardQueryService(CreditcardRepository creditcardRepository) {
            this.creditcardRepository = creditcardRepository;
        }

        @Transactional(readOnly = true)
        public List<Creditcard> findByCriteria(CreditcardCriteria criteria) {
            log.debug("find by criteria : {}", criteria);
            final Specification<Creditcard> specification = createSpecification(criteria);
            return creditcardRepository.findAll(specification);
        }

        @Transactional(readOnly = true)
        public Page<Creditcard> findByCriteria(CreditcardCriteria criteria, Pageable page) {
            log.debug("find by criteria : {}, page: {}", criteria, page);
            final Specification<Creditcard> specification = createSpecification(criteria);
            return creditcardRepository.findAll(specification, page);
        }

        @Transactional(readOnly = true)
        public long countByCriteria(CreditcardCriteria criteria) {
            log.debug("count by criteria : {}", criteria);
            final Specification<Creditcard> specification = createSpecification(criteria);
            return creditcardRepository.count(specification);
        }

        protected Specification<Creditcard> createSpecification(CreditcardCriteria criteria) {
            Specification<Creditcard> specification = Specification.where(null);
            if (criteria != null) {
                if (criteria.getId() != null) {
                    specification = specification.and(buildSpecification(criteria.getId(), Creditcard_.id));
                }
                if (criteria.getAccountNumber() != null) {
                    specification = specification.and(buildStringSpecification(criteria.getAccountNumber(), Creditcard_.accountNumber));
                }
                if (criteria.getBalance() != null) {
                    specification = specification.and(buildRangeSpecification(criteria.getBalance(), Creditcard_.balance));
                }
                if (criteria.getRewardPoints() != null) {
                    specification = specification.and(buildRangeSpecification(criteria.getRewardPoints(), Creditcard_.rewardPoints));
                }
                if (criteria.getUserID() != null) {
                    specification = specification.and(buildStringSpecification(criteria.getUserID(), Creditcard_.userID));
                }
            }
            return specification;
        }
    }

And the Data Transfer Object (DTO) that stores the criteria passed as an argument to the QueryService:

    public class CreditcardCriteria implements Serializable, Criteria {

        private static final long serialVersionUID = 1L;

        private LongFilter id;
        private StringFilter accountNumber;
        private BigDecimalFilter balance;
        private LongFilter rewardPoints;
        private StringFilter userID;

        public CreditcardCriteria() {
        }

        public CreditcardCriteria(CreditcardCriteria other) {
            this.id = other.id == null ? null : other.id.copy();
            this.accountNumber = other.accountNumber == null ? null : other.accountNumber.copy();
            this.balance = other.balance == null ? null : other.balance.copy();
            this.rewardPoints = other.rewardPoints == null ? null : other.rewardPoints.copy();
            this.userID = other.userID == null ? null : other.userID.copy();
        }
        …
    }

The Web layer. The Web layer of the microservice (aka the REST layer) is the exposed part of the application, defining REST endpoints to be consumed by clients such as micro frontends. A request sent to an endpoint is caught by the web layer and, according to the code logic, calls are made to the Service and, indirectly, to the Domain layer.
The generated REST controller (reconstructed from the flattened listing):

    @RestController
    @RequestMapping("/api")
    @Transactional
    public class CreditcardResource {

        private final Logger log = LoggerFactory.getLogger(CreditcardResource.class);

        private static final String ENTITY_NAME = "creditcardCreditcard";

        @Value("${jhipster.clientApp.name}")
        private String applicationName;

        private final CreditcardService creditcardService;

        private final CreditcardQueryService creditcardQueryService;

        public CreditcardResource(CreditcardService creditcardService, CreditcardQueryService creditcardQueryService) {
            this.creditcardService = creditcardService;
            this.creditcardQueryService = creditcardQueryService;
        }

        @PostMapping("/creditcards")
        public ResponseEntity<Creditcard> createCreditcard(@RequestBody Creditcard creditcard) throws URISyntaxException {
            log.debug("REST request to save Creditcard : {}", creditcard);
            if (creditcard.getId() != null) {
                throw new BadRequestAlertException("A new creditcard cannot already have an ID", ENTITY_NAME, "idexists");
            }
            Creditcard result = creditcardService.save(creditcard);
            return ResponseEntity.created(new URI("/api/creditcards/" + result.getId()))
                .headers(HeaderUtil.createEntityCreationAlert(applicationName, true, ENTITY_NAME, result.getId().toString()))
                .body(result);
        }

        @PutMapping("/creditcards")
        public ResponseEntity<Creditcard> updateCreditcard(@RequestBody Creditcard creditcard) throws URISyntaxException {
            log.debug("REST request to update Creditcard : {}", creditcard);
            if (creditcard.getId() == null) {
                throw new BadRequestAlertException("Invalid id", ENTITY_NAME, "idnull");
            }
            Creditcard result = creditcardService.save(creditcard);
            return ResponseEntity.ok()
                .headers(HeaderUtil.createEntityUpdateAlert(applicationName, true, ENTITY_NAME, creditcard.getId().toString()))
                .body(result);
        }

        @GetMapping("/creditcards")
        public ResponseEntity<List<Creditcard>> getAllCreditcards(CreditcardCriteria criteria, Pageable pageable) {
            log.debug("REST request to get Creditcards by criteria: {}", criteria);
            Page<Creditcard> page = creditcardQueryService.findByCriteria(criteria, pageable);
            HttpHeaders headers = PaginationUtil.generatePaginationHttpHeaders(ServletUriComponentsBuilder.fromCurrentRequest(), page);
            return ResponseEntity.ok().headers(headers).body(page.getContent());
        }

        @GetMapping("/creditcards/count")
        public ResponseEntity<Long> countCreditcards(CreditcardCriteria criteria) {
            log.debug("REST request to count Creditcards by criteria: {}", criteria);
            return ResponseEntity.ok().body(creditcardQueryService.countByCriteria(criteria));
        }

        @GetMapping("/creditcards/{id}")
        public ResponseEntity<Creditcard> getCreditcard(@PathVariable Long id) {
            log.debug("REST request to get Creditcard : {}", id);
            Optional<Creditcard> creditcard = creditcardService.findOne(id);
            return ResponseUtil.wrapOrNotFound(creditcard);
        }

        @DeleteMapping("/creditcards/{id}")
        public ResponseEntity<Void> deleteCreditcard(@PathVariable Long id) {
            log.debug("REST request to delete Creditcard : {}", id);
            creditcardService.delete(id);
            return ResponseEntity.noContent()
                .headers(HeaderUtil.createEntityDeletionAlert(applicationName, true, ENTITY_NAME, id.toString()))
                .build();
        }

        @GetMapping("/creditcards/user/{userID}")
        public ResponseEntity<Creditcard> getCreditcardByUserID(@PathVariable String userID) {
            log.debug("REST request to get Creditcard by user ID : {}", userID);
            Optional<Creditcard> creditcard = creditcardService.findOneWithUserID(userID);
            return ResponseUtil.wrapOrNotFound(creditcard);
        }
    }

The Micro Frontends. You can find all the micro frontends under the ui/widgets folder. Each of them matches a business use case, is implemented as a Web Component, and consumes APIs from the banking microservice. (Figure: architecture for the banking microservice and micro frontends.) We will focus on the Dashboard Card React instance, which uses the Banking API and the CreditCard endpoints to display the amount and points for a credit card. You can find it under the ui/widgets/banking-widgets/dashboard-card-react folder.

Frontend Code: Focus on the CreditCard Implementation. The micro frontend is generic enough to handle more than one kind of account exposed by the Banking API (Checking, Savings, and Credit Cards). The same frontend component can be used multiple times and be configured to display different sets of data.

Declare a React application as a custom element. The custom element is part of the Web Components specification. The micro frontend is declared as a custom element in the React application: in the src/custom-elements folder, the SeedscardDetailsElement.js file defines the whole component by implementing the HTMLElement interface (reconstructed):

    const ATTRIBUTES = { cardname: 'cardname' };

    class SeedscardDetailsElement extends HTMLElement {
      onDetail = createWidgetEventPublisher(OUTPUT_EVENT_TYPES.transactionsDetail);

      constructor(...args) {
        super(...args);
        this.mountPoint = null;
        this.unsubscribeFromKeycloakEvent = null;
        this.keycloak = getKeycloakInstance();
      }

      static get observedAttributes() {
        return Object.values(ATTRIBUTES);
      }

      attributeChangedCallback(cardname, oldValue, newValue) {
        if (!Object.values(ATTRIBUTES).includes(cardname)) {
          throw new Error(`Untracked changed attribute: ${cardname}`);
        }
        if (this.mountPoint && newValue !== oldValue) {
          this.render();
        }
      }

      connectedCallback() {
        this.mountPoint = document.createElement('div');
        this.appendChild(this.mountPoint);

        const locale = this.getAttribute('locale') || 'en';
        i18next.changeLanguage(locale);

        this.keycloak = { ...getKeycloakInstance(), initialized: true };
        this.unsubscribeFromKeycloakEvent = subscribeToWidgetEvent(KEYCLOAK_EVENT_TYPE, () => {
          this.keycloak = { ...getKeycloakInstance(), initialized: true };
          this.render();
        });

        this.render();
      }

      render() {
        const customEventPrefix = 'seedscard.details.';
        const cardname = this.getAttribute(ATTRIBUTES.cardname);

        const onError = error => {
          const customEvent = new CustomEvent(`${customEventPrefix}error`, { details: { error } });
          this.dispatchEvent(customEvent);
        };

        const ReactComponent = React.createElement(SeedscardDetailsContainer, {
          cardname,
          onError,
          onDetail: this.onDetail,
        });
        ReactDOM.render(
          <KeycloakContext.Provider value={this.keycloak}>{ReactComponent}</KeycloakContext.Provider>,
          this.mountPoint
        );
      }

      disconnectedCallback() {
        if (this.unsubscribeFromKeycloakEvent) {
          this.unsubscribeFromKeycloakEvent();
        }
      }
    }

    if (!customElements.get('sd-seeds-card-details')) {
      customElements.define('sd-seeds-card-details', SeedscardDetailsElement);
    }

We can see the cardname attribute is passed to the custom element to switch between the different kinds of data we want to retrieve. The sd-seeds-card-details tag can then be used to instantiate a new component. Here is an example from public/index.html where the default cardname is "checking":

    <body onLoad="onLoad()">
      <noscript>You need to enable JavaScript to run this app.</noscript>
      <sd-seeds-card-details cardname="checking"></sd-seeds-card-details>
      <sd-seeds-card-config></sd-seeds-card-config>
    </body>

Calling the Banking API. The Banking API exposes endpoints generated from the JHipster entity declarations, and the MFE consumes this API through HTTP calls. The src/api/seedscard.js file contains the endpoint definitions (reconstructed):

    import { DOMAIN } from 'api/constants';

    const getKeycloakToken = () => {
      if (window && window.entando && window.entando.keycloak && window.entando.keycloak.authenticated) {
        return window.entando.keycloak.token;
      }
      return '';
    };

    const defaultOptions = () => {
      const token = getKeycloakToken();
      return {
        headers: new Headers({
          Authorization: `Bearer ${token}`,
          'Content-Type': 'application/json',
        }),
      };
    };

    const executeFetch = params => {
      const { url, options } = params;
      return fetch(url, { method: 'GET', ...defaultOptions(), ...options })
        .then(response =>
          response.status >= 200 && response.status < 300
            ? Promise.resolve(response)
            : Promise.reject(new Error(`${response.statusText || response.status}`))
        )
        .then(response => response.json());
    };

    export const getSeedscard = params => {
      const { id, options, cardname } = params;
      const url = `${DOMAIN}${DOMAIN.endsWith('/') ? '' : '/'}banking/api/${cardname}s/${id}`;
      return executeFetch({ url, options });
    };

    export const getSeedscardByUserID = params => {
      const { userID, options, cardname } = params;
      const url = `${DOMAIN}${DOMAIN.endsWith('/') ? '' : '/'}banking/api/${cardname}s/user/${userID}`;
      return executeFetch({ url, options });
    };

The requests defined here are flexible enough to be used with multiple types of cards; this is why the path depends on the cardname and the userID: banking/api/${cardname}s/user/${userID}.

Render banking info. The src/components folder contains the rendering part, with both SeedcardDetails.js and SeedcardDetailsContainer.js. SeedcardDetails.js renders the card header (icon, title derived from the capitalized cardname, the last digits of the account number, and a kebab-menu action icon), then either the formatted balance with its "Balance" caption and, when present, the reward points, or a "You don't have a {cardname} account" caption; clicking the card fires onDetail with the cardname and accountID. SeedcardDetailsContainer.js handles the API call (reconstructed):

    getSeedscardByUserID({ userID, cardname })
      .then(account => {
        this.setState({ notificationStatus: null, notificationMessage: null, account });
        if (cardname === 'checking' && firstCall) {
          onDetail({ cardname, accountID: account.id });
        }
      })
      .catch(e => onError(e))
      .finally(() => this.setState({ loading: false }));

When the widget is deployed, the requests contain the right card name value and the data retrieved matches it, as shown in the first screenshot from the Dashboard.

Configure the Widget in the Entando Platform. As Entando wraps the micro frontend as a widget, it can come with a configuration widget to set values such as the cardname. This allows you to change the cardname value from the Entando App Builder without needing to deploy the micro frontend again. To access it, design a page, click on a widget's kebab menu, and click Settings; the Settings menu is only present when a configuration widget is provided with a widget.

What's next? In this article we saw a lot of code, from the data layer with the domain definition to the data rendering in the micro frontend that displays the CreditCard information. The next blog post will dive into the CMS components of the Standard Banking Demo; it will contain less code and will focus more on the Standard Banking Demo bundle, explaining the different CMS components you can use to build the content of your pages. 2021-09-16 16:31:08
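Given the endpoints the article lists for the generated CreditcardResource (GET /api/creditcards, /api/creditcards/{id}, /api/creditcards/user/{userID}), a client call can be sketched as below. This is my own illustration, not demo code: the base URL, helper name, and token value are assumptions.

```python
# Hypothetical client sketch for the JHipster-generated banking API.
# BASE and the token are assumptions; the paths mirror the endpoints
# listed in the article.
import urllib.request

BASE = "http://localhost:8081"  # assumed address of the banking service

def creditcard_request(path: str, token: str) -> urllib.request.Request:
    """Build an authenticated GET request, mirroring the Keycloak
    bearer header the micro frontend attaches to its fetch calls."""
    return urllib.request.Request(
        f"{BASE}/api{path}",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )

# e.g. fetch the credit card belonging to a given user ID:
req = creditcard_request("/creditcards/user/42", token="<keycloak-token>")
# urllib.request.urlopen(req) would perform the call against a running service.
```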
Overseas TECH DEV Community Protect your Repository from Azure Pipelines! https://dev.to/n3wt0n/protect-your-repository-from-azure-pipelines-3c0f In Azure DevOps you can add checks and pipeline permissions to your repository to get more control over what pipelines can and cannot do with your code, and today I'm gonna show you how to do it.

Video. As usual, if you are a visual learner or simply prefer to watch and listen instead of reading, here you have the video with the whole explanation and demo, which, to be fair, is much more complete than this post. If you'd rather read, well, let's just continue.

Why Protect a Repo? Today we talk about using repos as protected resources in YAML pipelines to get a more granular level of security on them. So, first things first: why would you even want to use this feature? Let me give you an example. It is fairly common to have multiple repositories in an Azure DevOps project, for example to host many sub-projects, and each repository may have one or more pipelines. In this scenario, by default, the pipelines can access the code in each and every repository in the team project. You may want, however, to control which pipelines can access which repositories. For example, say you have two repositories, A and B, in the same project, and two pipelines, X and Y, that normally build these repositories. You may want to prevent pipeline Y from accessing repository A. In general, you want the contributors of A to control which pipelines they provide access to. As a contributor of repo A, you can add checks and pipeline permissions to your repository. Let's see how to do this.

How To Do It. To access those settings, navigate to Project Settings, select Repositories, and then the repo you want to add checks to. You will notice a new menu called Approvals and Checks in the ellipses, where you can configure any of the approvals and checks like you have on Service Connections, Environments, etc. And under the Security tab you can manage the list of users, pipelines, and even branches and tags that can access the repository. Anytime a YAML pipeline uses a repository, the Azure Pipelines infrastructure verifies that all the checks and permissions are satisfied. Remember that these permissions and checks apply only to YAML pipelines; classic pipelines do not recognize these new features.

Conclusions. Hope this was helpful! Let me know in the comment section below if you have any questions, and what your scenarios are for using this feature. Also check out this video where I talk about how to disable a repository in Azure Repos and why you might do it. Like, share, and follow me for more content: YouTube, Buy me a coffee, Patreon, Newsletter, CoderDave.io website, merch, Facebook page, GitHub, Twitter, LinkedIn, Podcast. 2021-09-16 16:29:30
Overseas TECH DEV Community Zero Trust: Is it anything new? https://dev.to/coroner/zero-trust-is-it-anything-new-1onb In theory, it isn't particularly new. The term "zero trust" has been around for years. De-perimeterisation, the main concept behind Zero Trust Architecture, was defined and promoted on the Jericho Forums, and even the management of risks associated with de-perimeterisation was discussed almost two decades ago. John Kindervag coined the concept while he was at Forrester, and Google implemented a Zero Trust Architecture framework referred to as BeyondCorp. Even so, in practice Zero Trust should mean more than just marketing hype, especially given that Joe Biden has ordered that "the Federal Government must adopt security best practices; advance toward Zero Trust Architecture". The corresponding NIST publication can serve as both a theoretical and a practical guideline, and should be applied to achieve worthwhile changes. But what are these theories and practices, and why are they so important? Let's take a look.

What is Zero Trust? It should be pointed out that Zero Trust is not a product but a model. Though it can be facilitated by one or more products, it primarily necessitates a change in approach. Before, common sense held that a private network has definite perimeters with a small number of entry points, and that the goal is to protect them. This way of thinking bears the strategic approach of the late medieval and early modern period: the defense of an area with definite boundaries, with the assets concentrated behind the walls of the fortress. Both attacking and defending armies were mostly focused on the entry point, just as red and blue teams focus on network defense tools in this castle-and-moat network security model. However, it is well known, as it was in the medieval period, that there is a much easier and more profitable way than a siege, namely sabotage. In the castle-and-moat model, if the authentication is circumvented at the entry point, there are no other mechanisms to prevent malicious activity, as you are already inside the perimeter. "You are trusted if you are inside the perimeter": this could be the motto of any malware developer. Zero Trust Architecture looks to overtake this old-fashioned perimeter approach. There are significant changes in the perimeter landscape that make the rise of Zero Trust quite timely. Before, there was a dogma that access to private resources from outside the private network was hard to obtain, so successfully authenticated users could access any resources on the private network. In the age of virtual private networks (VPNs) and cloud services, private resources can be reached very easily from the internet, as there are no longer definite perimeters with just a small number of entry points. This means that the way we defend the assets of the organization should change: any access to any resource, by any user or machine, must be authenticated and authorized, independently of whether the resource is accessed from inside or outside the organization's private network. Zero Trust means that lack of authentication means no trust is given at all; access can be granted after successful authentication, but in a restricted manner, just like in real life. Network security is no different from other types of security: it uses the same tenets and learns from the history of them all, as discussed above.

Everything is a Resource. Zero Trust Architecture requires us to consider all data sources and computing services as resources, with no exceptions, even if the network is composed of multiple classes of devices. Practically speaking, this means there should be one or more control points (Policy Enforcement Points) in the network that all network traffic goes through and where policy can be enforced. As a result, the castle-and-moat security model is completely inadequate: with Zero Trust there is no resource concentration, no definite perimeters, and the focus is on the …
traffic paths of the communication instead of entry points As traffic paths can be controlled comprehensively in computer networks there is no need to control the entry point itself It is necessary to segment the network as much as possible and separate these segments from each other This technique is known as micro segmentation as it creates several micro perimeters or segments in the network As the crossing between these micro segments are controlled and transit is permitted in a restricted manner accurate authentication and authorization can be performed at the borders The situation is the same as it is with real life borders except that there are no or at least there shouldn t be green borders in computer networks Lateral movement cannot be performed in the network as it is no longer hierarchical and there are no resources of distinct importance as all resources are treated equally meaning access to all resources is verified independently from the classification of the resource just like any passengers are authenticated at the borders independently of whether the passenger is a particularly important person or not Secure CommunicationSecure communication is an essential part of the Zero Trust Security Model for several reasons Secure communication provides confidentiality integrity and authenticity Authenticity makes it possible for the communicating parties to identify each other and also makes it possible for the Policy Engine to identify the source of communication The Policy Engine can then make a decision about whether access can be granted to a resource for a given subject which will be enforced at the Policy Enforcement Point Confidentiality inhibits the passive attacker to get credentials or other valuable information by eavesdropping on the network which can be used during an active attack Integrity ensures that the communication cannot be altered without the knowledge of the communication parties making it impossible to modify sensitive information 
such as a bank account number or invoice amount in order to add misleading information or fishing for part of the original content Session Based AccessAccording to the Zero Trust tenets access to the resources are granted in a session based manner Both authentication and authorization are session based and the users must be granted only the level of access needed to fulfill their role which means we must follow the least privilege principle A session based approach guarantees a time limitation as the access to a resource is not necessarily granted in a subsequent session or with the same privileges as privileges should also be limited to those that are strictly necessary session by session Strictly Enforced Authentication and AuthorizationAs has already been mentioned the basic concept is that no one is trusted by default from either inside or outside the network Authentication and authorization are always checked at each access request before access is granted to an organizational resource though a question arises of how a user can be authenticated The most used authentication mechanisms are indirect meaning they cannot supply direct evidence to the user s identity just certain factors such as something the user knows knowledge something the user has possession or something the user is inherence assuming the exclusivity of knowledge possession or inherence Single factor like a password might be compromised but the probability of compromising multiple factors with different type is negligibly low which is why it is so important to use multi factor authentication One fundamental problem of identification by knowledge is that if it is unchanged over a long time just like a password or a certificate and becomes compromised it does not identify the user yet the abuse is hard to detect Credentials that change over a short period of time such as a Time based One Time Password TOTP are one option but this solution cannot solve the problem on its own as an attacker who has 
stolen the shared secret which is also a long term credential can generate a valid TOTP However combined with a possession based factor this can help to identify the human itself instead of just their knowledge This is especially true when accessing the TOTP generator with software or hardware tokens that can be accessed after an inherence based identification such as unlocking a mobile device or a security token by fingerprint However for the user or client in general identification is just one factor in dynamic polices The identification process can also encompass any associated attributes assigned by the enterprise to an account Characteristics of the device used by the client such as software versions installed patch level network or physical location time and date of request and previously observed behavior can also be part of the verification of a client and can also determine the applied policy Behavioral attributes can also be measured and deviations can be checked against the observed usage patterns before access is granted to a particular resource The sensitivity and the classification of the resource should also vary according to the conditions of the resource access For instance under certain circumstances only read only access is granted to a particular resource but after additional authentication by a second or a third factor read write access can be provided The situation here is the same as it is in physical security where entering a higher classified place requires additional authentication In terms of network and data security higher data acts like a location in physical security Monitoring devices in real timeEstablishing a continuous diagnostics and mitigation CDM system is also a requirement of Zero Trust Architecture Knowing the current security related state of the network and the actors involved is essential as restrictions should be applied on a client or a server when a security issue can be assumed to be related to them For instance if a 
device runs a service that has a remotely executable vulnerability which is currently unpatched the access of the affected service should be limited until the service is patched to mitigate the vulnerability To be able to do that it is also necessary to have the information that there is a security issue in the organization This information can come from a CDM system and may imply a change in the earlier mentioned dynamic policies once a security issue is recognized and subsequently fixed Appearance of a new device on the network is a typical scenario where monitoring is essential as rules must be applied to the network traffic of the newly appeared device Zero Trust requires that we do not trust in a device just because it is inside the private network so the rule could simply lead to a denial However it is also possible that only one path should be opened which makes possible to register the device on the network for the user especially if it is a mobile or a bring your own device which can access only a limited part of the network with limited privileges Independently from the applied policies the information about the fact that there is an unregistered device that has appeared on the network which tries to communicate is a must as it could indicate a legitimate usage of the network but also an illegitimate or at least a suspicious one Not just the devices but the network traffic they generate should also be monitored As part of the incident management during investigation we will need all available information Before any incident occurs changes in resource accesses may indicate a security issue For instance requesting a higher level of privilege during a resource access like requesting writing permission instead of the ordinary read only one or requesting it from an unusual network location like from a foreign country the organization has no connection with at an unusual time like at midnight in case of a colleague that works to or trying to discover the 
network may all indicate the presence of a malicious software that may generate input to the CDM causing quarantine of the device to prevent the spread of ransomware for instance ConclusionThe NIST does not articulate any requirements of network security in its Zero Trust publications that would not have already been articulated by others before but it does so in a way that makes it possible to reach not only C level executives but also state leaders as it has influenced the Biden Administration s plans for strengthening US cybersecurity Leading technology research firms such as Gartner and Forrester also promote the Zero Trust model which makes the concept almost unavoidable on providers side and also generates hype about the topic Beyond business considerations we should keep the basic statement of Zero Trust in mind there could be attackers both inside and outside of the organization so we should never simply trust but always verify and enforce the principle of least privilege Zero Trust Is it anything new is licensed under a Creative Commons Attribution ShareAlike International License 2021-09-16 16:19:22
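The Time-based One-Time Password mechanism mentioned in the article above can be sketched in a few lines of Python. This is a minimal illustration of RFC 6238 (an HMAC-SHA-1 over a 30-second time counter, with RFC 4226 dynamic truncation), not a production implementation:

```python
import hmac
import struct
import time

def totp(secret, at=None, digits=6, step=30):
    """Time-based One-Time Password (RFC 6238), HMAC-SHA-1 variant."""
    # The time counter is the number of `step`-second intervals since the epoch.
    counter = int(time.time() if at is None else at) // step
    mac = hmac.new(secret, struct.pack(">Q", counter), "sha1").digest()
    offset = mac[-1] & 0x0F                       # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)

# RFC 6238 test secret "12345678901234567890"; at t=59s the time counter is 1
print(totp(b"12345678901234567890", at=59))  # → 287082
```

As the article notes, the shared secret itself is a long-term credential: anyone who steals it can compute the same codes, which is why TOTP only helps as one factor among several.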
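The dynamic-policy idea described above (identity factors combined with device and context attributes) can also be sketched. The attribute names and rules below are invented for illustration and are not taken from NIST SP 800-207; a real Policy Engine would evaluate far richer signals:

```python
# Hypothetical per-session policy decision; every name here is illustrative.
def decide_access(request):
    """Return the access level a Policy Engine might grant for one session."""
    factors = request.get("verified_factors", set())  # e.g. {"knowledge", "possession"}
    if not factors:
        return "deny"                                 # no authentication, no trust
    risky = (
        request.get("unpatched_vulnerability")        # flagged by the CDM system
        or request.get("unusual_location")
        or request.get("unusual_time")
    )
    if risky:
        return "deny"
    # Read-write only after multiple factor types; read-only otherwise.
    return "read-write" if len(factors) >= 2 else "read-only"

print(decide_access({"verified_factors": {"knowledge"}}))                # read-only
print(decide_access({"verified_factors": {"knowledge", "possession"}}))  # read-write
```

Note that the decision is made per request, per session: presenting an extra factor in a later session can raise the granted level, and a CDM alert can lower it, exactly as the article describes.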
Apple AppleInsider - Frontpage News Claris & EonXI partner to increase diversity in tech https://appleinsider.com/articles/21/09/16/claris-eonxi-partner-to-increase-diversity-in-tech?utm_medium=rss Claris & EonXI partner to increase diversity in tech. Apple spin-off Claris is launching a new entrepreneur training program in partnership with venture capital fund EonXI to improve diversity in technology. Credit: Claris. The new initiative will focus on low-code software development, business training, mentorship, and networking, with the end goal of expanding diversity in technology and business ownership, Claris announced Thursday. Read more 2021-09-16 16:38:44
Apple AppleInsider - Frontpage News Apple's brand-new iPad 9th Gen is discounted to $299 at Walmart right now https://appleinsider.com/articles/21/09/16/apples-brand-new-ipad-9th-gen-is-discounted-to-299-at-walmart-right-now?utm_medium=rss Apple's brand-new iPad 9th Gen is discounted to $299 at Walmart right now. In what is the steepest preorder discount we've seen, Walmart has Apple's new iPad 9th Generation marked down to $299. Walmart's preorder discount on the new iPad beats Apple's own student discount and is open to everyone, not just college students. At press time, the promo price applies to the 64GB Wi-Fi spec in your choice of Space Gray or Silver. Read more 2021-09-16 16:51:46
Apple AppleInsider - Frontpage News Everything released at Apple's iPhone 13 release event - and what we thought https://appleinsider.com/articles/21/09/15/everything-released-at-apples-iphone-13-release-event---and-what-we-thought?utm_medium=rss Everything released at Apple's iPhone 13 release event, and what we thought. There was a lot to process during Apple's California Streaming event on Tuesday, so here's everything you need to know about what was released, and what we think were the best and worst releases of the day. On Tuesday, Apple held its yearly iPhone release event, which heralded the much-anticipated iPhone 13 alongside the Apple Watch Series 7. However, those were far from the only things released. iPad… Read more 2021-09-16 16:05:47
海外TECH Engadget Ford will spend $250 million to boost F-150 Lightning production https://www.engadget.com/ford-ramps-up-f-150-lightning-production-165311045.html?src=rss Ford will spend $250 million to boost F-150 Lightning production. Ford's electric F-150 Lightning is clearly in high demand, and the company is determined to keep up. The automaker has paired news of pre-production work with a promise to invest an extra $250 million and create new jobs to increase production capacity. That should help Ford build more Lightning trucks per year, little comfort given the reservations the company now has, but the move should reduce wait times. Most of the jobs will go to workers assembling the electric F-150 at the Rouge Electric Vehicle Center in Dearborn, Michigan, while others will build more batteries at the Rawsonville Components Plant and motors at the Van Dyke Electric Powertrain Center. The first trucks should be available in the spring. The production numbers won't compete with conventional trucks for a while: as Autoweek observed, Ford sold regular F-150 trucks by the hundreds of thousands each year before the pandemic and chip shortages came into play. While the Lightning may be more than a niche product, it's not yet at the point where Ford would have to reconsider its conventional truck production. There's also a certain amount of posturing involved with the news; Ford is clearly eager to please a government promoting made-in-America EVs. However, it's still a recognition of pent-up demand for electric pickups, both from Ford and from the industry as a whole. Not that Ford has much choice: with Rivian already producing its first trucks, Ford risks losing sales to competitors if it doesn't ramp up manufacturing. 2021-09-16 16:53:11
海外TECH Engadget Boss' SY-200 is a powerful guitar synth that fits on a pedalboard https://www.engadget.com/boss-sy-200-guitar-pedal-synth-superbooth-164427743.html?src=rss Boss' SY-200 is a powerful guitar synth that fits on a pedalboard. Boss is certainly no stranger to the world of guitar synths; in fact, Roland and Boss have been at the forefront of guitar synths and MIDI controllers since the '70s. After launching the absolutely epic SY-1000, then cramming a bunch of synth sounds into an actual guitar (the Eurus) earlier this year, Boss is going a little more traditional with the SY-200. The SY-200 isn't quite as big as the SY-1000, which is basically a pedalboard in and of itself, but it's definitely larger and more comprehensive than the compact Boss pedals you're probably familiar with, like the SY-1 synth. The SY-200 has a wide range of sounds spread across different categories, and can be played without the need for a special pickup. Each voice has three parameters that you can customize, which pales in comparison to the full-on programmable synth inside the SY-1000, but it's definitely a lot more approachable and pedalboard-friendly. You've got everything from ripping leads to warm pads to delicate bell tones at your disposal, though the Boss demo video is real heavy on traditional guitar shredding. Oh, and it's fully polyphonic, which we've come to expect from Boss synth pedals, but it's still worth calling out. There are two footswitches giving you some control over live variation while playing, but you can also connect an expression pedal, or control parameters and program changes via MIDI. You've got preset slots for saving and recalling your favorite sounds. And last but definitely not least, there are send and return jacks for blending in other effects in parallel with your synth sounds. The Boss SY-200 will be available in January, alongside Boss' new IR-based amp and cab simulator, the IR-200. 2021-09-16 16:44:27
海外TECH Engadget Cadillac Lyriq EV reservations open on September 18th https://www.engadget.com/cadillac-lyriq-ev-reservation-date-163044124.html?src=rss Cadillac Lyriq EV reservations open on September 18th. Cadillac is preparing to leap into the electric vehicle market with the Lyriq, and now the automaker has revealed when you'll be able to lock in your reservation. You'll get your first chance to lay claim to a Lyriq this Saturday, September 18th. Cadillac will host a two-hour livestream on the YouTube masthead in the lead-up to reservations opening. The Lyriq starts at just under $60,000 and has a range of over 300 miles. It has a giant 33-inch wraparound display, and it's built on parent company GM's Ultium battery platform. Cadillac plans to release its first EV in the first half of 2022. 2021-09-16 16:30:44
海外TECH Engadget WhatsApp starts testing local business directories https://www.engadget.com/whatsapp-business-directory-test-sao-paulo-161835714.html?src=rss WhatsApp starts testing local business directories. WhatsApp already allows you to chat with businesses, but you may soon also have the ability to find them through the app as well. This week the company started testing a directory feature that allows users to scan through local shops and services that have a presence on WhatsApp and contact them. The tool is currently only available in São Paulo, Brazil, but a screenshot shared by Will Cathcart, the head of WhatsApp, shows that you can use the feature to sort businesses by category and by how close they are to you. Matt Idema, vice president of business messaging at Facebook, told Reuters the test involves "thousands of shops and services". He added the company is likely to make the feature available in India and Indonesia next. "Based on feedback from the people who try it over the next few months, we'll look at expanding this service to other cities and other types of businesses available on WhatsApp," Cathcart said separately on Twitter. While it's best known as an app you use to chat with your friends and family, WhatsApp has increasingly pushed into the e-commerce space. Since 2018 it has offered a separate app for businesses to use to communicate with their customers, and more recently it's gone out of its way to make it easier for people to shop directly from WhatsApp. At times that hasn't always worked out for the company, as was the case when it changed its privacy policy earlier in the year. On that note, Cathcart said WhatsApp won't log the location of a user, or the businesses they browse, when using the directory feature. 2021-09-16 16:18:35
海外TECH Network World Palo Alto shapes SASE package for hybrid enterprises https://www.networkworld.com/article/3633652/palo-alto-shapes-sase-package-for-hybrid-enterprises.html#tk.rss_all Palo Alto shapes SASE package for hybrid enterprises. Palo Alto Networks has bolted together its SD-WAN and security technologies to offer an integrated, cloud-based secure access service edge (SASE) offering aimed at simplifying distributed enterprises. Called Prisma SASE, the package brings together the company's core Prisma Access package of cloud-based next-generation security gateways with the Prisma SD-WAN technology it got when it bought CloudGenix last year. To read this article in full, please click here 2021-09-16 16:32:00
Linux OMG! Ubuntu! Ubuntu 21.10 Default Wallpaper Revealed http://feedproxy.google.com/~r/d0od/~3/2i9a2GXfN2o/ubuntu-21-10-default-wallpaper-revealed Ubuntu 21.10 Default Wallpaper Revealed. Ubuntu's new default wallpaper has been revealed. As expected, the new background doesn't deviate too far from the traditional template, and continues the trend of putting a large animal mascot face at the center of a purple and orange gradient. You may notice that the mascot artwork of the Indri itself is less stylised than in previous releases. We've had oodles of origami-inspired icons (Yakkety Yak, Zesty Zapus), ample angular and/or geometric motifs (Groovy Gorilla, Disco Dingo), and a clutch of companions composed entirely of intersecting concentric rings (Bionic Beaver, Cosmic Cuttlefish, Eoan Ermine, Hirsute Hippo). Indri is a… This post, Ubuntu 21.10 Default Wallpaper Revealed, is from OMG! Ubuntu! Do not reproduce elsewhere without permission. 2021-09-16 16:17:21
海外科学 NYT > Science Exxon, Chevron, BP and Others Called to Testify on Climate Disinformation https://www.nytimes.com/2021/09/16/climate/exxon-oil-disinformation-house-probe.html Exxon, Chevron, BP and Others Called to Testify on Climate Disinformation. Executives from Exxon, Shell, BP and others are being called to testify in Congress next month, after a secret recording this year exposed an Exxon official boasting of such efforts. 2021-09-16 16:42:40
海外TECH WIRED Join Us for RE:WIRED—Conversations on Humanity’s Biggest Bets https://www.wired.com/story/rewired-conversations-about-humanitys-biggest-bets Join Us for RE:WIRED: Conversations on Humanity's Biggest Bets. This November, some of the most interesting thinkers, technologists, and artists in the world will convene to discuss how tech is shaping our future. 2021-09-16 16:04:13
ニュース BBC News - Home Aukus: China denounces US-UK-Australia pact as irresponsible https://www.bbc.co.uk/news/world-58582573?at_medium=RSS&at_campaign=KARANGA australia 2021-09-16 16:19:51
ニュース BBC News - Home Cladding: Panels failed fire tests 13 years before Grenfell https://www.bbc.co.uk/news/uk-58584348?at_medium=RSS&at_campaign=KARANGA cladding 2021-09-16 16:05:18
ニュース BBC News - Home John Lewis charters ships to ensure Christmas stock arrives https://www.bbc.co.uk/news/business-58581812?at_medium=RSS&at_campaign=KARANGA holiday 2021-09-16 16:06:19
ニュース BBC News - Home Mother wins court case over Staffordshire landfill site emissions https://www.bbc.co.uk/news/uk-england-stoke-staffordshire-58577136?at_medium=RSS&at_campaign=KARANGA emissions 2021-09-16 16:15:39
ニュース BBC News - Home Covid: First booster jabs for NHS staff and calls for military help https://www.bbc.co.uk/news/uk-58587087?at_medium=RSS&at_campaign=KARANGA coronavirus 2021-09-16 16:21:30
ニュース BBC News - Home Analysis: Town Hall nerves over Gove's new role https://www.bbc.co.uk/news/uk-politics-58583104?at_medium=RSS&at_campaign=KARANGA analysis 2021-09-16 16:19:08
ニュース BBC News - Home Covid-19 in the UK: How many coronavirus cases are there in my area? https://www.bbc.co.uk/news/uk-51768274?at_medium=RSS&at_campaign=KARANGA cases 2021-09-16 16:22:40
ビジネス ダイヤモンド・オンライン - 新着記事 Energy prices surge in Europe, dealing another blow to global supply - via WSJ https://diamond.jp/articles/-/282521 Europe 2021-09-17 01:12:00
北海道 北海道新聞 Hokkaido to announce intent to join October's trial easing of activity restrictions; Sapporo, Asahikawa and other cities under consideration https://www.hokkaido-np.co.jp/article/590052/ novel coronavirus 2021-09-17 01:09:05
GCP Cloud Blog Upgrade Postgres with pglogical and Database Migration Service https://cloud.google.com/blog/topics/developers-practitioners/upgrade-postgres-pglogical-and-database-migration-service/ Upgrade Postgres with pglogical and Database Migration Service. As many of you are probably aware, Postgres is ending long-term support for version 9.6 in November. However, if you're still using version 9.6, there's no need to panic: Cloud SQL will continue to support version 9.6 for one more year after in-place major version upgrades become available. But if you would still like to upgrade right now, Google Cloud's Database Migration Service (DMS) makes major version upgrades for Cloud SQL simple, with low downtime. This method can be used to upgrade from any Postgres version 9.6 or later. In addition, your source doesn't have to be a Cloud SQL instance: you can set your source to be an on-prem, self-managed Postgres instance, or an AWS source, to migrate to Cloud SQL and upgrade your Postgres version at the same time. DMS also supports MySQL migrations and upgrades, but this blog post will focus on Postgres. If you're looking to upgrade a MySQL instance, check out Gabe Weiss's post on the topic.
Why are we here? You're probably here because Postgres 9.6 will soon reach end of life. Otherwise, you might want to take advantage of the latest Postgres features, like incremental sorting and parallelized vacuuming for indexes. Finally, you might be looking to migrate to Google Cloud SQL and thinking that you might as well upgrade to the latest major version at the same time.
Addressing version incompatibilities. First, before upgrading, we'll want to look at the breaking changes between major versions. Especially if your goal is to bump up multiple versions at once, you'll need to account for all of the changes between those versions. You can find these changes by looking at the Release Notes for each version after your current version, up to your target version. For example, before you begin upgrading a Postgres 9.6 instance, you'll need to first address the incompatibilities introduced in version 10, including renaming any SQL functions, tools, and options that reference "xlog" to "wal", removing the ability to store unencrypted passwords on the server, and removing support for floating-point timestamps and intervals.
Preparing the source for migration. There are a few steps we'll need to take before our source database engine is ready for a DMS migration; a more detailed overview of these steps can be found in this guide. First, you must create a database named postgres on the source instance. This database may already exist if your source is a Cloud SQL instance. Next, install the pglogical package on your source instance; DMS relies on pglogical to transfer data between your source and target instances. If your source is a Cloud SQL instance, this step is as easy as setting the cloudsql.logical_decoding and cloudsql.enable_pglogical flags to on. Once you have set these flags, restart your instance for them to take effect. This post will focus on using a Cloud SQL instance as the source, but you can find instructions for RDS instances here and for on-prem, self-managed instances here. If your source is a self-managed instance (i.e. on Compute Engine, an on-premises instance, or an Amazon RDS/Aurora instance), this process is a little more involved. Once you have enabled the pglogical flags on the instance, you will need to install the extension on each of your source databases that is not one of the template databases (template0 and template1). If you are using a source other than Cloud SQL, you can check here to see which source databases need to be excluded. If you're running Postgres 9.6 or later on your source instance, run CREATE EXTENSION IF NOT EXISTS pglogical on each database in the source instance that will be migrated. Next, you'll need to grant privileges on the to-be-migrated databases to the user that you'll be using to connect to the source instance during migration; instructions on how to do this can be found here. When creating the migration job, you will enter the username and password for this user when creating a connection profile.
Creating the migration job in DMS. The first steps for creating a migration job in DMS are to define a source and destination. When defining a source, you'll need to create a connection profile by providing the username and password of the migration user that you granted privileges to earlier, as well as the IP address of the source instance. The latter will be auto-populated if your source is a Cloud SQL instance. Next, when creating the destination, you'll want to make sure that you have selected your target version of Postgres. After selecting your source and destination, you choose a connectivity method (see this very detailed post by Gabe Weiss for a deep dive on connectivity methods) and then run a test to make sure your source can connect to your destination. Once your test is successful, you're ready to upgrade. Once you start the migration job, data stored in the two instances will begin to sync; it might take some time until the two instances are completely synced. You can periodically check whether all of your data has synced by following the steps linked here. All the while, you can keep serving traffic to your source database until you're ready to promote your upgraded destination instance.
Promoting your destination instance and finishing touches. Once you've run the migration, there are still a few things you need to do before your destination instance is production-ready. First, make sure any settings you have enabled on your source instance are also applied to your destination instance. For example, if your organization requires that production instances only accept SSL connections, you can turn on the enforce-SSL flag for your instance. Some system configurations, such as high availability and read replicas, can only be set up after promoting your instance. To reduce downtime, DMS migrations run continuously while applications still use your source database. However, before you promote your target to the primary instance, you must first shut down all client connections to the source to prevent further changes. Once all changes have been replicated to the destination instance, you can promote the destination, ending the migration job. More details on best practices when promoting can be found here. Finally, because DMS depends on pglogical to migrate data, there are a few limitations of pglogical that DMS inherits. The first is that pglogical only migrates tables that have a primary key; any other tables will need to be migrated manually. To identify tables that are missing a primary key, you can run this query, and there are a few strategies you can use for migrating tables without a primary key, which are described here. Next, pglogical only migrates the schema for materialized views, but not the data. To migrate over the data, first run SELECT schemaname, matviewname FROM pg_matviews to list all of the materialized view names; then, for each view, run REFRESH MATERIALIZED VIEW <view_name>. Third, pglogical cannot migrate large objects. Tables with large objects need to be transferred manually. One way to transfer large objects is to use pg_dump to export the table or tables that contain the large objects, and import them into Cloud SQL. The safest time to do this is when you know that the tables containing large objects won't change. It's recommended to import the large objects after your target instance has been promoted, but you can perform the dump and import steps at any time. Finally, pglogical does not automatically migrate users. To list all users on your source instance, run \du, then follow the instructions linked here to create each of those users on your target instance. After promoting your target and performing any manual steps required, you'll want to update any applications, services, load balancers, etc. to point to your new instance. If possible, test this out with a dev/staging version of your application to make sure everything works as expected. If you're migrating from a self-managed or on-prem instance, you may have to adjust your applications to account for the increased latency of working with a Cloud SQL database that isn't right next to your application. You may also need to figure out how you can connect to your Cloud SQL instance. There are many paths to connecting to Cloud SQL, including the Cloud SQL Auth proxy, libraries for connecting with Python, Java, and Go, and using a private IP address with a VPC connector. You can find more info on all of these connection strategies in the Cloud SQL Connection Overview docs. We did it! (cue fireworks) If you made it this far, congratulations. Hopefully you now have a working, upgraded Cloud SQL Postgres instance. If you're looking for more detailed information on using DMS with Postgres, take a look at our documentation. Related Article: MySQL major version upgrade using Database Migration Service. Google's Database Migration Service gives us the tool we need to perform major version upgrades for MySQL. Read Article 2021-09-16 16:15:00
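The materialized-view step described above (pglogical copies the view definition but not its data) is easy to script. The sketch below only builds the REFRESH statements; in practice the (schemaname, matviewname) rows would come from running the pg_matviews query over a live connection (e.g. with psycopg2, which is assumed rather than shown), and the view names used here are invented for illustration:

```python
def refresh_statements(matviews):
    """Given (schemaname, matviewname) pairs from pg_matviews,
    build the REFRESH statements that pglogical does not run for us."""
    def quote(ident):
        # Double-quote identifiers, escaping any embedded double quotes.
        return '"' + ident.replace('"', '""') + '"'
    return [f"REFRESH MATERIALIZED VIEW {quote(s)}.{quote(v)};" for s, v in matviews]

# e.g. rows fetched with: SELECT schemaname, matviewname FROM pg_matviews;
for stmt in refresh_statements([("public", "daily_sales"), ("reports", "q3_summary")]):
    print(stmt)
```

Quoting the identifiers keeps the generated SQL valid for mixed-case or otherwise unusual view names; each statement can then be executed against the promoted destination instance.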
