IT |
ITmedia All Articles |
[ITmedia News] First star image and selfie arrive from NASA's James Webb Space Telescope |
https://www.itmedia.co.jp/news/articles/2202/12/news031.html
|
itmedianewsnasa |
2022-02-12 07:13:00 |
AWS |
AWS |
Creating Panels, Alerts, Transforms and Dashboards with Amazon Managed Grafana | Amazon Web Services |
https://www.youtube.com/watch?v=cgG_taLvHbk
|
Creating Panels, Alerts, Transforms and Dashboards with Amazon Managed Grafana | Amazon Web Services. In this video, you'll see how you can add dashboards, panels, alerts and transformations in Amazon Managed Grafana. With this solution, you can visualize performance metrics from multiple data sources, transform and query operational data, and set alerts and notifications to proactively capture issues. For more information on this topic, please visit the resource below. ABOUT AWS: Amazon Web Services (AWS) is the world's most comprehensive and broadly adopted cloud platform, offering over 200 fully featured services from data centers globally. Millions of customers, including the fastest-growing startups, largest enterprises and leading government agencies, are using AWS to lower costs, become more agile and innovate faster. #AWS #AmazonWebServices #CloudComputing #observability #grafana #monitoring |
2022-02-11 22:29:02 |
js |
New posts tagged JavaScript - Qiita |
Got a node-gyp error when installing a package with npm |
https://qiita.com/hikotaro_san/items/90258665b5bc2a771564
|
nodepregyperrbuilderror |
2022-02-12 07:47:32 |
Overseas TECH |
Ars Technica |
This Ancient Roman ceramic pot was probably a portable toilet, study finds |
https://arstechnica.com/?p=1833400
|
roman |
2022-02-11 22:19:14 |
Overseas TECH |
MakeUseOf |
How to See Hidden Files on Your Mac |
https://www.makeuseof.com/tag/show-hidden-files-mac/
|
files |
2022-02-11 22:45:12 |
Overseas TECH |
MakeUseOf |
How to Change the Default Account for Apple’s Notes App |
https://www.makeuseof.com/how-to-change-default-account-notes-iphone-mac/
|
account |
2022-02-11 22:30:12 |
Overseas TECH |
MakeUseOf |
Speed Up Chrome By Changing These 8 Flags |
https://www.makeuseof.com/tag/speed-up-chrome-by-changing-these-8-flags/
|
Speed Up Chrome by Changing These 8 Flags. While many people claim Chrome is a memory hog, it is possible to greatly improve your browser's speed by tweaking some "flags." Here are the eight best tweaks that you can make today. |
2022-02-11 22:18:02 |
Overseas TECH |
DEV Community |
Javascript Objects |
https://dev.to/collinsonindo/javascript-objects-2gig
|
JavaScript Objects

JavaScript works on the basis of object-oriented programming. This article describes how to use objects, properties and methods, and how to create your own objects.

What is an object? In JavaScript, an object is a standalone entity with properties and a type. Property values can be of any type, including string, number, boolean, null and undefined. For example, a user object can have the following properties: username, email and gender.

Objects and properties: A JavaScript object has properties associated with it. A property of an object is a variable that is attached to the object. The properties of an object define its characteristics. You access the properties of an object with simple dot notation: objectName.propertyName. Like all JavaScript variables, both the object name and the property name are case sensitive. You can define a property by assigning it a value. To demonstrate this, here's a sample object, myDog:

    const myDog = {
      "name": "Joe",
      "legs": 4,
      "tails": 1,
      "friends": ["everyone"]
    };

"name", "legs", "tails" and "friends" are properties, while "Joe", 4, 1 and "everyone" are values.

Object methods: Methods are actions that can be performed on objects.

    const person = {
      firstName: "John",
      lastName: "Doe",
      id: 5566,
      fullName: function() {
        return this.firstName + " " + this.lastName;
      }
    };

Creating a JavaScript object using an object literal: This is the easiest way to create a JavaScript object. Using an object literal, you both define and create an object in one statement.

    const person = {
      firstName: "John",
      lastName: "Doe",
      age: 50,
      eyeColor: "blue"
    };

Conclusion: Objects in JavaScript can be compared to objects in real life; the concept can be understood through tangible, real-life objects. In JavaScript, an object is a standalone entity with properties and a type. Compare it with a chair, for example: a chair is an object with properties. A chair has a color, a design, a weight, and a material it is made of. In the same way, JavaScript objects can have properties which define their characteristics. |
2022-02-11 22:51:53 |
Overseas TECH |
DEV Community |
Don't Panic! - We got a new place for our VSF Forum! |
https://dev.to/vue-storefront/dont-panic-we-got-a-new-place-for-our-vsf-forum-2i96
|
Don't Panic! We got a new place for our VSF Forum. In the book series The Hitchhiker's Guide to the Galaxy, Douglas Adams wrote that once the dolphins knew that the Earth was going to be destroyed for a hyperspatial express route, being the most intelligent creatures on the planet, they flew off the planet Earth saying the iconic phrase: "So long, and thanks for all the fish." But why this introduction, on a subject that can sound so different? All the most intelligent beings flew off to better places before the destruction. We want to share the news with you: we will fully archive our Vue Storefront Forum and move to GitHub Discussions. This shift will enable our community to be more engaged and also bonded with other communities.

The shift, in two acts. Just like in a good movie or theater play, we will divide this process into two acts.

Act 1: "For a moment, nothing happened. Then, after a second or so, nothing continued to happen." We will permanently archive and close the current Forum, making it only a consulting and historical place for all the developers and hitchhikers out there.

Act 2: "If you want to survive out here, you've got to know where your towel is." The GitHub Discussions tab on each repository will be the place to go. Each core repository and integration has its Discussions tab enabled. You can interact and participate in this process without losing the connection between any GitHub issue or PR on any repository and the ongoing discussion.

The Show: If you are reading this, the show has already started; the Forum is already archived and the Vue Storefront GitHub Discussions are live. You can join us for: Core Frameworks (Vue Storefront 1, Vue Storefront 2, Storefront UI, VSF Capybara) and Integrations (Shopify, Magento, Spree, Shopware PWA, BigCommerce, Vendure, nopCommerce, Odoo, Prestashop, Kibo Commerce, Medusa, WooCommerce). See you on GitHub! |
2022-02-11 22:24:54 |
Overseas TECH |
DEV Community |
Understand Kotlin Function Literal with Receiver by Example |
https://dev.to/vtsen/understand-kotlin-function-literal-with-receiver-by-example-1d13
|
Understand Kotlin Function Literal with Receiver by Example

This article provides some simple code examples of using a function literal with receiver, also known as a lambda/anonymous function with receiver. This article was originally published at vtsen.hashnode.dev in January.

I came across this lambda syntax, NavGraphBuilder.() -> Unit, and it turns out it is called a function literal with receiver, also known as a lambda/anonymous function with receiver. The syntax looks like this:

    Receiver.(Parameters) -> ReturnType

The following shows some examples of building a custom string using a function literal with receiver.

Example 1: Function Literal With Receiver

    fun buildCustomStringExample1(action: StringBuilder.(String) -> Unit): String {
        val stringBuilder = StringBuilder()
        stringBuilder.action("Example1")
        return stringBuilder.toString()
    }

action is the function literal (lambda function) with receiver. StringBuilder is the receiver. It acts like an extension function of StringBuilder which takes in the string as an input parameter. To call action, StringBuilder is instantiated and action is called like an extension function: stringBuilder.action("Example1"). You can imagine action is like a callback function that belongs to StringBuilder.

Usage: This is the usage of the function literal with receiver:

    val output = buildCustomStringExample1 { content ->
        this.append("<tag>")
        append(content)
        append("</tag>")
    }
    println(output)

We call buildCustomStringExample1 with a function literal (lambda function) parameter. In this lambda function we specify how we build the custom string: wrap the content with <tag> and </tag>. content is the input parameter, which is passed in from the buildCustomStringExample1 function. this is the StringBuilder instance that was created in the buildCustomStringExample1 function, and it can be omitted. append is the function that belongs to StringBuilder.

Output: The output looks like this:

    <tag>Example1</tag>

Example 2: Function Literal Without Receiver

A function literal (lambda function) with receiver can be rewritten without using the receiver, based on the following syntax:

    (Receiver, Parameters) -> ReturnType

This is the usual lambda expression which takes in parameters. The first parameter is StringBuilder, which is the receiver in Example 1 above.

    fun buildCustomStringExample2(action: (StringBuilder, String) -> Unit): String {
        val stringBuilder = StringBuilder()
        action(stringBuilder, "Example2")
        return stringBuilder.toString()
    }

action is the usual callback function which takes in two parameters. To call action, StringBuilder is instantiated and passed in as the first parameter of the action callback function: action(stringBuilder, "Example2").

Usage: This is the usage of the function literal without receiver:

    val output = buildCustomStringExample2 { stringBuilder, content ->
        stringBuilder.append("<tag>")
        stringBuilder.append(content)
        stringBuilder.append("</tag>")
    }
    println(output)

This is similar to Example 1, except we no longer use this, which has been replaced by stringBuilder as the first lambda parameter. content is the second parameter of the lambda function.

Output: The output looks like this, in the same form as Example 1:

    <tag>Example2</tag>

Conclusion: It is not hard to understand a function literal with receiver. It is basically a simplified way to write a normal function literal (lambda function), removing the need for the extra receiver parameter. A function literal with receiver can be rewritten as a function literal without receiver:

    Receiver.(Parameters) -> ReturnType  can be rewritten as  (Receiver, Parameters) -> ReturnType

See also: Kotlin Tips and Tricks |
2022-02-11 22:06:19 |
Overseas TECH |
Engadget |
Some gendered slurs no longer on Wordle’s word list |
https://www.engadget.com/wordle-bans-gendered-profanity-slurs-223032715.html?src=rss
|
Some gendered slurs no longer on Wordle's word list. Have you ever typed out an NSFW word (or two, or five) on Wordle in a fit of frustration? Well, it's time for you to get your verbal recall skills out of the gutter. The New York Times has yanked a handful of gendered slurs from Wordle's internal dictionary, reported Polygon on Friday. The words "bitch," "whore" and "sluts" have been removed from the game's word list. In other words (no pun intended), typing out any one of these terms will have the same effect as if you typed out a string of random letters like "asjfk" or "jkjkj": a grey box will appear with the phrase "not in word list," and you'll feel dumb. And for what it's worth, none of these expunged words have been solutions to prior Wordle puzzles, and there's not much reason to believe they ever would have been in the future. Not every profane term or curse word has been scrubbed off Wordle's list as of yet. According to Engadget's research, a number of slang terms for genitalia, as well as some run-of-the-mill curses, still pass muster. But given the Grey Lady's avoidance of bad language in both its news coverage and crosswords, it may just be a matter of time. One thing that is certain, however, is this: you'll never see a winning word in The New York Times version of Wordle that you couldn't use in front of your grandmother. "Offensive words will always be omitted from consideration," a Times spokesperson told Polygon. The viral game, created by developer Josh Wardle for his partner, was purchased by the Times for a seven-figure amount late last month. Wordle just migrated to the Times' website yesterday, and there have been a couple of hiccups. Some have noticed that their Wordle game statistics haven't automatically transferred over as the Times promised. Other people have opinions on the new NYT-like game interface and the likelihood that the game may soon be behind the newspaper's paywall, though it remains free for now. If you're a naturally vulgar-minded person, don't despair: there's always Lewdle, Wordle's X-rated cousin. |
2022-02-11 22:30:32 |
Overseas TECH |
Network World |
Data center capex on the rise despite cloud momentum |
https://www.networkworld.com/article/3649751/data-center-capex-on-the-rise-despite-cloud-momentum.html#tk.rss_all
|
Data center capex on the rise despite cloud momentum. Global capital expenditure on data center infrastructure is set to grow over the next five years, in spite of the general move toward cloud in the enterprise, according to a report released earlier this month by Dell'Oro Group. Part of that spending growth will be driven by hyperscalers like Google, Amazon and Microsoft buying up data center equipment for their own public clouds, but an underrecognized trend is that the cloud isn't for every organization, according to the report's author, research director Baron Fung. |
2022-02-11 22:12:00 |
Overseas Science |
NYT > Science |
F.D.A. Clears Monoclonal Antibody Drug From Eli Lilly |
https://www.nytimes.com/2022/02/11/us/politics/eli-lilly-antibody-treatment-covid.html
|
F.D.A. Clears Monoclonal Antibody Drug From Eli Lilly. The federal government has ordered doses of the monoclonal antibody treatment, which is meant for high-risk Covid patients early in their illness. |
2022-02-11 22:51:50 |
Overseas Science |
NYT > Science |
Douglas Trumbull, Visual Effects Wizard, Dies at 79 |
https://www.nytimes.com/2022/02/11/movies/douglas-trumbull-dead.html
|
Douglas Trumbull, Visual Effects Wizard, Dies at 79. His technical savvy was on display in films like "2001: A Space Odyssey," "Star Trek: The Motion Picture," "Close Encounters of the Third Kind" and "Blade Runner." |
2022-02-11 23:00:11 |
Overseas Science |
NYT > Science |
Fact-Checking Joe Rogan’s Interview With Robert Malone That Caused an Uproar |
https://www.nytimes.com/2022/02/08/arts/music/fact-check-joe-rogan-robert-malone.html
|
Fact-Checking Joe Rogan's Interview With Robert Malone That Caused an Uproar. Mr. Rogan, a wildly popular podcast host, and his guest Dr. Malone, a controversial infectious disease researcher, offered a litany of falsehoods over three hours. |
2022-02-11 22:58:39 |
Finance |
News - 保険市場TIMES |
e-Net Small-Amount Short-Term Insurance forms business alliance with エルズサポート |
https://www.hokende.com/news/blog/entry/2022/02/12/080000
|
e-Net Small-Amount Short-Term Insurance forms business alliance with エルズサポート: household-goods insurance premiums payable via rent-guarantee companies. e-Net少額短期保険株式会社 and エルズサポート株式会社 announced that, through a business alliance, they will begin offering a service under which household-goods insurance premiums are paid by the rent-guarantee company, using the payment-agency service of 株式会社インサイト, under the name "LACTii" (ラクティ). |
2022-02-12 08:00:00 |
News |
BBC News - Home |
PM sent Downing Street lockdown party questionnaire by police |
https://www.bbc.co.uk/news/uk-60356373?at_medium=RSS&at_campaign=KARANGA
|
parties |
2022-02-11 22:41:40 |
News |
BBC News - Home |
Trucker protests: Ontario calls state of emergency |
https://www.bbc.co.uk/news/world-us-canada-60352980?at_medium=RSS&at_campaign=KARANGA
|
border |
2022-02-11 22:01:38 |
News |
BBC News - Home |
Rangnick says 'it's obvious' Man Utd need to buy a striker this summer |
https://www.bbc.co.uk/sport/football/60353591?at_medium=RSS&at_campaign=KARANGA
|
needs |
2022-02-11 22:34:11 |
Hokkaido |
Hokkaido Shimbun |
Fire at Niigata factory: four in cardiopulmonary arrest, two others unaccounted for |
https://www.hokkaido-np.co.jp/article/644825/
|
Sanko Seika |
2022-02-12 07:03:34 |
Hokkaido |
Hokkaido Shimbun |
Canadian province declares state of emergency over truck-convoy protests against COVID rules |
https://www.hokkaido-np.co.jp/article/644835/
|
state of emergency |
2022-02-12 07:03:00 |
Business |
Toyo Keizai Online |
Don't assume the Fed will prop up stock prices: the "karmic retribution" of an America scrambling to fight inflation | 新競馬好きエコノミストの市場深読み劇場 | Toyo Keizai Online |
https://toyokeizai.net/articles/-/510966?utm_source=rss&utm_medium=http&utm_campaign=link_back
|
karmic retribution |
2022-02-12 07:30:00 |
GCP |
Cloud Blog |
Why you should be using Flex templates for your Dataflow deployments |
https://cloud.google.com/blog/topics/developers-practitioners/why-you-should-be-using-flex-templates-your-dataflow-deployments/
|
Why you should be using Flex templates for your Dataflow deployments

Last year, Google announced general availability of Dataflow Flex templates. We covered many details of this new way to deploy Dataflow pipelines in this blog. Here we offer additional tips and suggestions, best practices, details on using Google Artifact Registry for storing a template's Docker images, and ways to reduce cost for certain kinds of pipelines. You can dive deeper by reviewing the GitHub repository that has the code implementing many of these suggestions. While we use Java in our examples, the majority of the recommendations also apply to Python pipelines.

If you are not familiar with Flex templates, here are the main advantages of using them compared to direct launch of the pipeline or classic templates:

Better security model. Flex templates allow assigning the least required privileges to different actors in the pipeline life cycle. Flex templates also help in cases where the pipeline needs to access services only available on certain networks or subnetworks.

Ability to dynamically define pipeline graphs. Dataflow is a runner of pipelines coded using the Apache Beam SDK. Each Beam pipeline is a directed acyclic graph (DAG) of transforms. Pipeline developers define this graph as a sequence of "apply transform" methods. Often it is useful to be able to run a single pipeline in slightly different configurations. A typical example is a pipeline which can be run in streaming mode to process Google Pub/Sub messages, or in batch mode to backfill from data residing in Google Cloud Storage (GCS) buckets. The only difference between the two modes is their starting transforms (reading from sources); the rest of the graph is the same. Interestingly enough, Beam DAGs don't have to have a single starting source transform, or be fully connected DAGs: one pipeline can contain a number of independent and completely different DAGs. This can be a useful feature where a single pipeline processes a number of separate data sources in parallel, for example when processing relatively small (MBs to GBs) GCS files of different types. Instead of running a pipeline per file, you can run a pipeline per group of files, and each will have its own processing DAG. Only direct deployment or Flex templates support the ability to dynamically define pipeline DAGs. We will show an example of dynamic pipeline generation later.

Ability to run pre-processing steps before pipeline construction. There are cases where a simple step performed before the pipeline is run can simplify the pipeline logic. Flex templates allow a certain amount of pre-processing to be run on a virtual machine (VM) with the pipeline credentials during pipeline construction time. Consider how you would build a pipeline to process a set of data files plus a single-record control file that provides the total number of records that must be present in the data files. It's possible to construct a Beam pipeline where one of its DAG branches would read the control file, create a single-value side input, and pass that input into some transform that would verify that the record count matches the expected values. But this whole DAG branch can be removed if you can read and validate the control file during graph construction time and pass the expected number of records as a parameter to the record-count-checking transform.
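To make that concrete, here is a minimal sketch of the idea (not code from the post or its repo): the control file is read while the graph is being constructed, so only a plain number is baked into the verification step. The bucket paths and the readControlFile() helper are hypothetical.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.TextIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Count;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.values.TypeDescriptors;

    public class ControlFileCheckPipeline {
      public static void main(String[] args) {
        // Pre-processing at graph-construction time: on a Flex template this
        // runs on the launcher VM, with the worker service account.
        long expected = readControlFile("gs://my-bucket/control/expected-count.txt");

        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        p.apply("ReadDataFiles", TextIO.read().from("gs://my-bucket/data/*.csv"))
         .apply("CountRecords", Count.globally())
         .apply("VerifyCount", MapElements.into(TypeDescriptors.longs())
             .via((Long actual) -> {
               // 'expected' is captured as a plain value: no side-input branch needed.
               if (actual != expected) {
                 throw new IllegalStateException(
                     "expected " + expected + " records, got " + actual);
               }
               return actual;
             }));
        p.run();
      }

      // Hypothetical helper; a real implementation might read the single-record
      // file from GCS, e.g. via FileSystems.open().
      private static long readControlFile(String path) {
        return 0L; // placeholder
      }
    }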
Let's see how many of these concepts apply to a particular use case. Imagine that you need to periodically synchronize data from an on-premises relational database into BigQuery. There are several constraints on your solution:

You need to use the pull approach: the pipeline will query the database to pull the data changed since the last synchronization event. While it is not the most efficient approach (Change Data Capture is preferred for high-volume synchronization), it's still a viable solution in many situations.

The database contains sensitive data and can only be connected to from a particular subnetwork in a Virtual Private Cloud (VPC). Database connection parameters, including the password, are stored in Secret Manager.

The database schema is a typical snowflake schema, with a number of infrequently changing dimension tables and more frequently changing fact tables. We need to synchronize a number of tables, orchestrating the synchronization using Cloud Composer.

If you were to implement this pipeline using the direct pipeline launch method or by creating a classic template, you would face three main challenges:

Cloud Composer VMs would need to be deployed in the same network as the database, and the service account they run under would need to be able to read the secrets; both are cases of excessive permissions.

If you coded the pipeline to process one table at a time, you would most likely incur unnecessary costs when processing dimension tables with very few changed records, and would need to address the default Dataflow quota on concurrent jobs per project.

The direct deployment method requires the pipeline code and a Java SDK to reside on the Composer VMs. It typically requires additional steps in a Composer workflow to get the latest released code from the code repository, with additional permissions to access that repository.

To understand the reason behind the first challenge, let's recall the life cycle of a pipeline. It's covered in the Dataflow documentation, but here is a concise description of the three major phases:

Graph construction. This happens when the main method of your pipeline's class is invoked and all those apply methods are called. They in turn invoke the expand methods of each transform class, and the pipeline object builds the execution graph of the pipeline.

Job creation. The execution graph is transferred to the Dataflow service, gets validated, and a Dataflow job is created.

Execution. The Dataflow service starts provisioning worker VMs. Serialized processing functions from the execution graph and the required libraries are downloaded to the worker VMs, and the Dataflow service starts distributing the data bundles to be processed on these VMs.

Among the three possible ways to launch a pipeline, the job creation and execution phases are identical. The only difference is the timing, location and security context of the graph construction phase:

Direct deployment: the graph is constructed on the VM which executes the main method (e.g., a Cloud Composer VM) and uses the credentials of the user or the service account associated with that VM.

Classic templates: the graph is typically constructed on the VM used by the build process and uses the credentials associated with that VM.

Flex templates: the graph is constructed at pipeline launch time, on a VM located in the same network as the Dataflow worker VMs and using the same service account as those worker VMs.

Only pipelines launched using Flex templates construct execution graphs in the same network location, and with the same credentials, as the VMs used in the execution phase. The graph construction phase is where many of Beam's standard transforms implement extensive validation of the input parameters and, importantly, call Google Cloud APIs to get additional information. Where it's run and what security context is used can determine whether your pipeline will fail to launch.
In our use case, Beam's JdbcIO transform can be used to query the data. During the expand method's execution, a database connection is established and a query is parsed by the database in order to obtain the metadata of the result set. This means that the invoker of the pipeline needs to have access to the database credentials and be on the same network as the database. If you were to run Cloud Composer jobs using the direct launch method, you would need to co-locate the Java code on the Composer VMs, grant the service account Composer uses the ability to access the secrets, and run the Composer VMs on the same network as the database. By using the Flex template launch method, you enable clean separation of the invoker credentials (Dataflow jobs will use a dedicated service account, which is the only account with read access to the secrets) and the invocation location (Cloud Composer can run in a different network than the database).

Next challenge: how do you synchronize tables efficiently? Ideally you want to start a pipeline which processes multiple tables in a single run. Many of the sync jobs will be processing relatively small amounts of data, and running a separate pipeline per table will create a lot of overhead. Syncing all tables at once will make your DBAs unhappy. A good middle ground is syncing several tables at a time, most likely grouped based on their interdependencies. This means that our pipeline needs to take the list of sync jobs to run as a parameter ("option" in Beam parlance) and dynamically build an execution graph based on this list. Classic templates will not allow us to do that: they have limited capability for parametrization, and the pipeline graph is determined at template build time. Flex templates handle this with ease; the original post illustrates with pseudo-code and a diagram how this dynamic graph would look in a pipeline with three tables to sync.
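The post's pseudo-code isn't reproduced in this feed, so the following is a stand-in sketch of the same idea, assuming a hypothetical --tables option; Create and MapElements stand in for the real per-table JdbcIO reads and BigQuery writes.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.Description;
    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.options.Validation;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.values.TypeDescriptors;

    public class DynamicSyncPipeline {
      public interface SyncOptions extends PipelineOptions {
        @Description("Comma-separated list of tables to synchronize in this run")
        @Validation.Required
        String getTables();
        void setTables(String value);
      }

      public static void main(String[] args) {
        SyncOptions options =
            PipelineOptionsFactory.fromArgs(args).withValidation().as(SyncOptions.class);
        Pipeline pipeline = Pipeline.create(options);

        // One independent DAG branch per requested table. The shape of the
        // graph is decided here, at launch time, from the --tables option;
        // only direct launch and Flex templates permit this.
        for (String table : options.getTables().split(",")) {
          pipeline
              .apply("Source_" + table, Create.of(table)) // stand-in for a per-table JdbcIO read
              .apply("Sync_" + table,
                  MapElements.into(TypeDescriptors.strings())
                      .via((String t) -> t));             // stand-in for transform + BigQuery write
        }
        pipeline.run();
      }
    }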
If the number of database changes is low, a single worker VM of the default machine type can handle updates to dozens of tables at a time; this can be a sizable cost saving compared to launching a pipeline per table. The pipeline will autoscale if it determines that it would benefit from multiple workers to process the records. Taken to the extreme, you can use a once-coded and staged Flex template to generate pipelines of almost unlimited complexity based on input parameters; the size of the serialized graph is capped, and this FAQ answer has more information. We created a GitHub repo containing a sample pipeline that solves this use case, plus Terraform scripts to create a sample environment and run a Cloud SQL instance to simulate the on-premises database. We will refer to several scripts in that repo in the discussion below. Let's walk through some of the nuances of building, staging and running Flex templates.

Building Flex templates

This blog describes in great detail the various steps involved in building and staging Flex templates. Here's a brief summary of the process: a developer codes the pipeline, compiles the code, and creates a Docker image that is deployed to Google Artifact Registry or Container Registry. Additionally, a template specification file is created in a Google Cloud Storage bucket. The template specification file has a pointer to the newly created Docker image and the metadata information about the pipeline parameters (name, type, description and validation rules). Note that no pipeline code is actually run at this point and no graph is generated; it is pure packaging of the code.

The gcloud dataflow flex-template build command is all that you need to create and stage the template in a single step. You give it the GCS location of the final template spec (the first positional TEMPLATE_FILE_GCS_PATH parameter), which will include the location of the Docker image, SDK info and additional parameters. If you don't provide a Docker image that you built yourself via the --image parameter, this command will run a Cloud Build to build the image. That image will be built on top of a particular Google-provided image (--flex-template-base-image; a bit more on that later) and will know that it needs to run the main method of the class provided via the --env FLEX_TEMPLATE_JAVA_MAIN_CLASS parameter. The image will be pushed into either Google Artifact Registry (GAR) or Google Container Registry (GCR), depending on the URL provided via the --image-gcr-path parameter. Our template-building script has examples of both URLs. You need to make sure that the user or service account which deploys the template has sufficient permissions to create artifacts in the destination registry, and the worker service account needs to be able to read the registry. This Terraform script sets the permissions needed to access both registries.

The code of the pipeline is added to the Docker image via the --jar parameter. Notice that you don't have to create an uber jar; it's perfectly fine to pass multiple --jar parameters. Our template-building script generates the list of libraries to add based on the directory where Maven's dependency plugin placed all the project's runtime and compile-time dependencies. If you use Gradle, this Stack Overflow thread shows how you can copy project dependencies to a folder for later use by the build script. You can also provide several default parameters and experiments that will be passed to the Dataflow run command; this is a good way to make sure that these parameters are always used, even if forgotten at the run stage.

If you plan to run your Flex template manually, using the "Create Job from Template" / "Custom Template" option on the Dataflow Jobs console page, you should add the template metadata: the name and description of the template, and the list of parameters with help text and JavaScript validation rules. You can provide the metadata for a Flex template by deploying a separate file to the same GCS bucket as the template spec, but a better alternative is to use the --metadata-file parameter with the name of the file on the local system; the metadata will then be embedded in the template spec. Keeping the metadata file as part of your source code, updating it as the pipeline changes, and automatically including it in the template spec during the build process will ensure that the UI the Dataflow console generates from the metadata matches the current version of your pipeline.
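Putting the flags above together, a build invocation might look like this sketch; the bucket, registry path, jar, class and file names are illustrative placeholders, not values from the post.

    gcloud dataflow flex-template build gs://my-bucket/templates/sync-pipeline.json \
        --image-gcr-path "us-central1-docker.pkg.dev/my-project/my-repo/sync-pipeline:my-tag" \
        --sdk-language JAVA \
        --flex-template-base-image "gcr.io/dataflow-templates-base/java11-template-launcher-base:<stable-tag>" \
        --jar "target/sync-pipeline.jar" \
        --env FLEX_TEMPLATE_JAVA_MAIN_CLASS=com.example.DynamicSyncPipeline \
        --metadata-file "metadata.json"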
CI/CD considerations

Your Flex template will eventually be deployed to a number of environments (e.g., integration testing, staging, production). To make your deployments stable and reproducible, you will need to follow certain best practices and decide on several build and deployment details. When you create a template, you need to select the base Docker image to use, which tag to use when publishing the newly created Docker image, and whether to always create a new template spec or keep overwriting the existing one in GCS. To help with these decisions, keep in mind that from the deployment-artifact point of view a Flex template is really just two physical pieces (assuming you embedded the metadata in the template spec): the template spec on GCS, and the Docker container with all the pipeline code residing in a registry. You might have unique requirements in your organization, but in most cases these policies should prevent many unpleasant surprises when you decide how to build and tag the generated Docker images:

Don't use the "latest" tag of the base image, or the JAVA11 or PYTHON3 shorthand that resolves to the "latest" tag. Use this documentation page to get the stable tag of the latest image. The resulting image path should look like this: "gcr.io/dataflow-templates-base/java11-template-launcher-base:<release>_RC<nn>". Treat upgrading the base images of your pipelines the same way you treat upgrading the Beam SDK or your logging framework: check for the new tag with every code release, or at least with some frequency, update the base template, and re-test your pipeline.

Don't use "latest" or another fixed tag for the Docker images that are built and deployed into the target registry (that's the part after the colon in the --image-gcr-path parameter). This is especially important if you use the same registry to push the images during the template build phase and pull them during the launch phase: you don't want your scheduled production pipeline to pull the latest Docker image that hasn't been fully tested yet. Instead, use a generated unique tag, e.g., a timestamp-based one. The template spec file will reference that unique tag, and you will never need to worry about a particular spec file pulling the wrong image.

Related to the point above, you should use some tag; otherwise the registry will automatically assign the "latest" tag, with all the issues mentioned above.

What about the template spec name? It mostly depends on how you are going to launch your pipelines: referencing the same bucket used for the build, or a separate bucket per environment where the pipeline is to run. There are several options; here are the most common ones:

The template spec path is unique per build, e.g., gs://my-bucket/templates/my-pipeline-<generated-timestamp>.json. You should be able to safely use a single bucket both for building templates and for launching them in various environments. Promoting a particular version to an environment is then just updating the name of the template file to the version to be used; for example, in Cloud Composer you could store that GCS file name in an environment variable. This is the safest way to deal with pipeline versioning. It also provides a very simple rollback mechanism: just change the pointer to the "last known good" template spec.

The template spec path is fixed, e.g., gs://my-bucket/templates/my-pipeline.json. This can be useful for initial development, but immediately after a pipeline is deployed into another environment you need to decide how to deal with versioning of the template spec. If you end up copying the spec to other buckets for other environments, you can turn on GCS versioning and copy specific versions ("generations" in GCS parlance) of the template to the destination environment, but this adds extra complexity to your process.

The template file is not generated in GCS at all during the template build. You can use the --print-only option to output the content of the template file to standard output, capture it and store it elsewhere (perhaps also in Artifact Registry), and create the GCS template spec only when needed.

Another useful thing to do is to add the template name and template version as labels to the template spec. This can be done by using the --additional-user-labels parameter.
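In shell form, that naming discipline might look like the following sketch, where one generated tag drives the code tag, the image tag, the version label and the spec file name; all names are illustrative.

    TAG="$(date +%Y%m%d-%H%M%S)"            # one unique tag per build
    gcloud dataflow flex-template build "gs://my-bucket/templates/sync-pipeline-${TAG}.json" \
        --image-gcr-path "us-central1-docker.pkg.dev/my-project/my-repo/sync-pipeline:${TAG}" \
        --additional-user-labels "template_name=sync-pipeline,template_version=${TAG}" \
        --sdk-language JAVA \
        --flex-template-base-image "gcr.io/dataflow-templates-base/java11-template-launcher-base:<stable-tag>" \
        --jar "target/sync-pipeline.jar" \
        --env FLEX_TEMPLATE_JAVA_MAIN_CLASS=com.example.DynamicSyncPipeline
    git tag "sync-pipeline-${TAG}"          # tag the code base with the same version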
Now every pipeline you start will have this information in the Job Info panel of the Dataflow UI. This will help you with troubleshooting, especially if you tagged your code base with the template version at build time. It is also useful for cost attribution and recurring pipeline performance comparisons: these labels will be automatically attached to the worker VMs running your pipelines (in addition to "dataflow_job_id" and "dataflow_job_name", which are automatically added by the Dataflow service) and will appear in Billing Exports to BigQuery.

To summarize all of these nuances, the best approach to naming, tagging and labeling the various parts of Flex templates is to generate a unique tag per build, tag your code with it, use it as the tag of the generated Docker image, add it as the template-version label to the template spec, and make it part of the template spec file name.

Launching Flex-templated pipelines

There are a number of ways to run these pipelines, all of them ultimately using the underlying Dataflow API. To launch a pipeline from Cloud Composer you can use a special operator. There are also automatically generated client libraries in multiple languages that wrap the Dataflow API (e.g., for Java). But the easiest way to do it from a machine where the gcloud SDK is installed is the gcloud dataflow flex-template run command. Most of the parameters are standard Dataflow parameters applicable to all launch methods. If you have List or Map parameters, specifying them on the command line can be a bit tricky; this page describes how to handle them. One notable parameter is --additional-user-labels: you can provide these labels ahead of time, at template build time, or you can dynamically add them during each pipeline launch.
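A launch could then look like this sketch; the job name, spec path, region and parameters are placeholders, and --tables refers to the hypothetical option from the earlier sketch.

    gcloud dataflow flex-template run "sync-pipeline-$(date +%Y%m%d-%H%M%S)" \
        --template-file-gcs-location "gs://my-bucket/templates/sync-pipeline-<build-tag>.json" \
        --region us-central1 \
        --parameters tables="dim_customer,dim_product,fact_order" \
        --additional-user-labels "triggered_by=composer"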
If you look at the console right after you launch the pipeline, you will notice that your job is in "Queued" status. This status is borrowed from FlexRS pipelines, which can be truly queued until there are enough preemptible VMs to run the pipeline. For Flex templates you can think of this status as "Launching": the Dataflow service starts a launcher VM, pulls the Docker image referenced in the template specification file, and runs the main method of your pipeline with the parameters you specified. The rest is as if you were running that main method from your development machine; the only difference is that this launcher VM is already on the same network as the worker VMs will be, and uses the same service account that will be used for the worker VMs. The launcher VM is a low-end machine and you can't control its type. The expectation is that starting the pipeline shouldn't be computationally or memory intensive. Making a couple of API calls is perfectly fine (our sample pipeline makes a call to Secret Manager, issues JDBC statement-prepare calls, and reads several small configuration files from Google Cloud Storage), but don't try to run AI model training there.

Troubleshooting the launch process

But what if something goes wrong and one of these calls fails, or there are other issues launching the pipeline? You will know about it because the pipeline status will change to "Failed" and you will see the error message "Error occurred in the launcher container: Template launch failed. See console logs." The launcher VM's console logs should be available in the Dataflow Log panel or in Cloud Logging. Once the launch process is complete, they are also copied from the launcher VM to the file listed in the very first log entry of the Dataflow job ("Console log from launcher will be available at gs://…"). To view these logs, either use the console or the "gsutil cat" command. You will find the exception and the stack trace there, and that's the first step in finding out what happened. The stack trace usually points to the culprit right away; insufficient permissions when accessing a service is a popular one. To troubleshoot operator errors (wrong parameters passed), find the log entry that contains "Executing java -cp", followed by your main class and the parameters, and review the parameter list. There can also be cases where your pipeline doesn't start because running the main method causes an out-of-memory exception. You can customize the Java start parameters by using the FLEX_TEMPLATE_JAVA_OPTIONS environment variable, as shown in our deployment script. Again, remember that the launcher VM is not a supercomputer.

Conclusion

For most production environments, Dataflow Flex templates are the best way to stage and deploy Dataflow pipelines. They support best security practices and provide the most flexibility in deployments.

Acknowledgments

Special thanks to several Dataflow team members, Nick Anikin (Engineering), Shan Kulandaivel (Sr. Product Manager) and Mehran Nazir (Product Manager), for contributing to this post.

Additional information: Turn any Dataflow pipeline into a reusable template; Dataflow templates (video from Beam Summit); Dataflow Flex Templates documentation.

Related Article: Turn any Dataflow pipeline into a reusable template. Flex Templates allow you to create templates from any Dataflow pipeline, with additional flexibility to decide who can run jobs and where to run them. Read Article |
2022-02-11 22:30:00 |