AWS |
AWS Big Data Blog |
Best practices for configuring your Amazon Elasticsearch Service domain |
https://aws.amazon.com/blogs/big-data/best-practices-for-configuring-your-amazon-elasticsearch-service-domain/
|
Best practices for configuring your Amazon Elasticsearch Service domain: Amazon Elasticsearch Service (Amazon ES) is a fully managed service that makes it easy to deploy, secure, scale, and monitor your Elasticsearch cluster in the AWS Cloud. Elasticsearch is a distributed database solution, which can be difficult to plan for and execute. This post discusses some best practices for deploying Amazon ES domains. |
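As an illustration of the kind of domain configuration such best practices lead to, here is a minimal, hypothetical sketch using boto3 (the AWS SDK for Python). The domain name, instance types, and sizes are placeholder examples, not recommendations from the post; the call itself requires valid AWS credentials.

```python
# Hypothetical Amazon ES domain configuration sketch: dedicated master
# nodes, zone awareness, and EBS storage. All names/sizes are examples.
domain_config = {
    "DomainName": "my-search-domain",  # hypothetical name
    "ElasticsearchVersion": "7.4",
    "ElasticsearchClusterConfig": {
        "InstanceType": "r5.large.elasticsearch",
        "InstanceCount": 2,                     # one data node per AZ
        "DedicatedMasterEnabled": True,
        "DedicatedMasterType": "c5.large.elasticsearch",
        "DedicatedMasterCount": 3,              # odd count avoids split-brain
        "ZoneAwarenessEnabled": True,
    },
    "EBSOptions": {"EBSEnabled": True, "VolumeType": "gp2", "VolumeSize": 100},
}


def create_domain(config):
    """Create the domain with boto3 (requires AWS credentials)."""
    import boto3  # AWS SDK for Python
    client = boto3.client("es")
    return client.create_elasticsearch_domain(**config)
```

Dedicated masters and zone awareness are two of the commonly cited resilience settings for production domains; the exact instance counts and types depend on your workload.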
2020-05-29 17:26:33 |
AWS |
AWS Developer Blog |
AWS SDK for C++ Version 1.8 – Developer Preview |
https://aws.amazon.com/blogs/developer/aws-sdk-for-c-version-1-8-developer-preview/
|
AWS SDK for C++ Version 1.8 – Developer Preview: We're happy to share that version 1.8 of the AWS SDK for C++ is now in developer preview. The AWS SDK for C++ provides a modern C++ (C++11 or later) interface for Amazon Web Services (AWS). It is performant and fully functioning with low- and high-level SDKs, and minimizes dependencies. The AWS SDK for C++ … |
2020-05-29 17:27:57 |
AWS |
AWS Machine Learning Blog |
Designing human review workflows with Amazon Translate and Amazon Augmented AI |
https://aws.amazon.com/blogs/machine-learning/designing-human-review-workflows-with-amazon-translate-and-amazon-augmented-ai/
|
Designing human review workflows with Amazon Translate and Amazon Augmented AI: The world is becoming smaller as many businesses and organizations expand globally. As businesses expand their reach to wider audiences across different linguistic groups, their need for interoperability with multiple languages increases exponentially. Most of the industry work is manual, slow, and expensive human effort, with many industry verticals struggling to find a scalable, reliable … |
2020-05-29 17:02:07 |
AWS |
AWS Management Tools Blog |
Implementing Serverless Transit Network Orchestrator (STNO) in AWS Control Tower |
https://aws.amazon.com/blogs/mt/serverless-transit-network-orchestrator-stno-in-control-tower/
|
Implementing Serverless Transit Network Orchestrator (STNO) in AWS Control Tower: Introduction — many of the customers we have worked with are using advanced network architectures in AWS for multi-VPC and multi-account architectures. Placing workloads into separate Amazon Virtual Private Clouds (VPCs) has several advantages, chief among them isolating sensitive workloads and allowing teams to innovate without fear of impacting other systems. Many companies are taking … |
2020-05-29 17:49:21 |
AWS |
AWS Startups Blog |
Beewise Combines IoT and AI to Offer an Automated Beehive |
https://aws.amazon.com/blogs/startups/beewise-combines-iot-and-ai-to-offer-an-automated-beehive/
|
Beewise Combines IoT and AI to Offer an Automated Beehive: Prior to Beewise, the latest beekeeping technology (if you can call it that) was created back in the 1800s. The "tech stack" was a literal stack of wooden boxes called a beehive, filled with honeycomb, not to mention bees. Today there's Beewise, an Israeli startup leveraging IoT and AI to offer the first autonomous beehive. |
2020-05-29 17:12:29 |
AWS |
AWS |
AWS DeepRacer League F1 ProAm Event: Week 3 |
https://www.youtube.com/watch?v=gW_FuQUpK9Y
|
AWS DeepRacer League F1 ProAm Event: Week 3. Join the race! It's week 3 of Daniel Ricciardo, Tatiana Calderón, and Rob Smedley's machine learning journey. Watch as their Time Trial and Object Avoidance models get faster and faster, preparing them for their first taste of head-to-head racing. Got FOMO? Find step-by-step tutorials, training, and services in the AWS DeepRacer console. It's free to race all throughout May (subject to terms and conditions; see aws.amazon.com/deepracer/pricing for details). |
2020-05-29 17:35:23 |
AWS |
AWS - Webinar Channel |
Centrally Manage AWS WAF and AWS Managed Rules Using AWS Firewall Manager - AWS Online Tech Talks |
https://www.youtube.com/watch?v=u27HLad-Wi8
|
Centrally Manage AWS WAF and AWS Managed Rules Using AWS Firewall Manager – AWS Online Tech Talks: Central configuration and management of AWS WAF helps organizations ensure a consistent security posture, especially for enterprises managing large numbers of accounts and resources. In this tech talk, we will cover how you can use AWS Firewall Manager for central management of the updated AWS WAF. We will also cover how you can enable Managed Rules for AWS WAF, a pre-configured set of rules managed by AWS or AWS Marketplace sellers, to scale protection across your resources. These rules are regularly updated as new issues emerge. We will also go over some of the new features that Firewall Manager now supports for the updated version of WAF. Learning objectives: learn how to centrally manage the latest version of AWS WAF using AWS Firewall Manager; learn how you can enable preconfigured WAF rules such as AWS Managed Rules at scale across your resources; and learn about the new features that Firewall Manager will support for the updated version of AWS WAF. To learn more about the services featured in this talk, please visit … |
2020-05-29 17:31:49 |
python |
New posts tagged Python - Qiita |
Tweet-deleting bot |
https://qiita.com/penta2019/items/31ace877cd2a73f46f76
|
Tweet-deleting bot: a bot that deletes tweets once a set period has passed since they were posted. |
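The selection logic behind such a bot (which tweets are past the age limit) can be sketched with the standard library alone. This is a hypothetical helper, not the article's code; the actual Twitter API calls for listing and deleting tweets are omitted.

```python
from datetime import datetime, timedelta, timezone


def tweets_to_delete(tweets, max_age_days, now=None):
    """Return the IDs of tweets older than max_age_days.

    `tweets` is an iterable of (tweet_id, created_at) pairs, where
    created_at is a timezone-aware datetime, as most Twitter client
    libraries return.
    """
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=max_age_days)
    return [tweet_id for tweet_id, created_at in tweets if created_at < cutoff]


# Example: with a 7-day limit, only the 29-day-old tweet is selected.
now = datetime(2020, 5, 30, tzinfo=timezone.utc)
tweets = [
    (1, datetime(2020, 5, 1, tzinfo=timezone.utc)),   # 29 days old
    (2, datetime(2020, 5, 29, tzinfo=timezone.utc)),  # 1 day old
]
print(tweets_to_delete(tweets, 7, now))  # → [1]
```

Keeping the filtering pure (no API calls) makes it easy to unit-test the age logic separately from the deletion side effects.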
2020-05-30 02:41:39 |
python |
New posts tagged Python - Qiita |
Installed TensorRT on Ubuntu 18.04 |
https://qiita.com/studio_haneya/items/0a617694a3b6f232eb7a
|
Installed TensorRT on Ubuntu 18.04: I installed CUDA and cuDNN on Ubuntu and set up TensorFlow with GPU support, but when I ran `import tensorflow` I got a warning saying that libnvinfer could not be loaded. |
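A quick way to check whether the dynamic linker can see libnvinfer (the TensorRT runtime library the warning refers to) is via the standard library, without importing TensorFlow at all. This is a hypothetical diagnostic helper, not part of the post:

```python
import ctypes.util


def find_libnvinfer():
    """Return the resolved name of libnvinfer if the dynamic linker can
    find it, or None (the situation that triggers TensorFlow's warning)."""
    return ctypes.util.find_library("nvinfer")


lib = find_libnvinfer()
if lib is None:
    print("libnvinfer not found: install TensorRT or add its lib "
          "directory to LD_LIBRARY_PATH")
else:
    print(f"libnvinfer found: {lib}")
```

If the library is installed but not found, adding its directory to `LD_LIBRARY_PATH` (or running `ldconfig` after installing the TensorRT packages) is the usual fix.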
2020-05-30 02:37:51 |
python |
New posts tagged Python - Qiita |
Python with VScode (Windows 10) |
https://qiita.com/nizu708/items/09b5d7785c31ceeeeea8
|
The computer isn't to blame at all (lol). That's exactly why Anaconda and Jupyter Notebook are a good entry point to Python, but if that isn't enough for you, you'll have to move on to the next step. |
2020-05-30 02:26:13 |
python |
New posts tagged Python - Qiita |
Steps from installing Django to displaying an HTML page |
https://qiita.com/kkkei257/items/40fa58849125e389f466
|
Run `python manage.py runserver` in the terminal, access the displayed address from a browser, and if the contents of index.html are shown, you're done. |
2020-05-30 02:01:45 |
js |
New posts tagged JavaScript - Qiita |
Understanding Promise and async/await through synchronous code before going asynchronous |
https://qiita.com/basd4g/items/b1c96de727a53c4b4698
|
The difference between synchronous and asynchronous processing, and how a Promise behaves in synchronous code. After this, you should be able to follow other articles that say things like "Promise chains exist to solve callback hell" or "async/await is about creating Promises." |
2020-05-30 02:00:50 |
Program |
New questions across all tags | teratail |
I want to get rid of the message that appears every time I open the Mac terminal! |
https://teratail.com/questions/266033?rss=all
|
I want to get rid of the message that appears every time I open the Mac terminal. Background / goal: every time I open the terminal, the following message appears, and I don't know how to get rid of it. |
2020-05-30 02:35:45 |
Program |
New questions across all tags | teratail |
Unable to download data |
https://teratail.com/questions/266032?rss=all
|
Unable to download data. Background / goal: when I try to download data, it freezes in the following state. |
2020-05-30 02:35:27 |
Program |
New questions across all tags | teratail |
I want to disable automatic indent removal on newline in Visual Studio Code |
https://teratail.com/questions/266031?rss=all
|
I want to disable automatic indent removal on newline in Visual Studio Code. Background / goal: in Visual Studio Code, when I place the cursor on an indented empty line and press Enter, that indentation is removed automatically. |
2020-05-30 02:31:21 |
Program |
New questions across all tags | teratail |
I don't know how to assign a value of a different type in pandas ("could not convert string to float") |
https://teratail.com/questions/266030?rss=all
|
I don't know how to assign a value of a different type in pandas ("could not convert string to float"). I'm getting the error below; the cause seems to be that I tried to assign an object-type value to a float column, but I don't know how to fix it. |
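The error in this question comes from strings that cannot be parsed as numbers. In pandas itself, the idiomatic fix is `pd.to_numeric(series, errors='coerce')`, which turns unparsable values into NaN. The underlying issue can be illustrated with the standard library alone (a hypothetical helper, not from the question):

```python
def to_float(value, default=float("nan")):
    """Convert a value to float, falling back to `default` for values
    (like '' or 'abc') that cannot be parsed -- the same situation that
    raises "could not convert string to float" in pandas/NumPy."""
    try:
        return float(value)
    except (TypeError, ValueError):
        return default


raw = ["1.5", "2", "", "abc", None]
cleaned = [to_float(v) for v in raw]
print(cleaned)  # → [1.5, 2.0, nan, nan, nan]
```

Coercing to NaN first, then deciding how to fill or drop the missing values, is usually cleaner than trying to assign mixed-type data directly into a float column.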
2020-05-30 02:25:45 |
Program |
New questions across all tags | teratail |
I want to properly synchronize rendering with PUN2 in Unity |
https://teratail.com/questions/266029?rss=all
|
I'm implementing synchronization with PUN2, but the rendering part doesn't sync properly. |
2020-05-30 02:18:27 |
Program |
New questions across all tags | teratail |
Freeing a copied pointer variable |
https://teratail.com/questions/266028?rss=all
|
Freeing a copied pointer variable: with code like `void main() { int* p = new int; int* i = p; delete i; }`, after `delete` is called on `i`, `cout << *i << endl` fails and raises an error, but `cout << *p << endl` can still read the contents as before. |
2020-05-30 02:18:20 |
Program |
New questions across all tags | teratail |
I want the values entered in each of three TextFields reflected as three rows in a tableview via an Add button |
https://teratail.com/questions/266027?rss=all
|
As in the image above, I want to build a memo app where the text entered into each of the three fields is added as three rows in the Tableview when the Add button is pressed; I'd like to know how to wire the code together. |
2020-05-30 02:06:36 |
Git |
New posts tagged Git - Qiita |
A beginner's notes on using Git, part 1 |
https://qiita.com/kensho_tk/items/20cb0eb7c757318b31b6
|
Git commands: the very first thing you must do when using git is create a repository. |
2020-05-30 02:11:53 |
Overseas TECH |
Ars Technica |
YouTube makes video chapters official |
https://arstechnica.com/?p=1679802
|
youtube |
2020-05-29 17:38:18 |
Overseas TECH |
Ars Technica |
iPhone privacy prompts discriminate against non-Apple apps, complaint says |
https://arstechnica.com/?p=1679794
|
apple |
2020-05-29 17:28:06 |
Overseas TECH |
DEV Community |
🎞️ Auto-generate video summaries with a machine learning model and a serverless pipeline 🐍 |
https://dev.to/googlecloud/auto-generate-video-summaries-with-a-machine-learning-model-and-a-serverless-pipeline-324i
|
️Auto generate video summaries with a machine learning model and a serverless pipeline Source code PicardParis cherry on py Auto generate visual summaries of videos with a machine learning model and a serverless pipeline Hello Dear developers Do you like the adage a picture is worth a thousand words I do Let s check if it also works for a picture is worth a thousand frames In this tutorial you ll see the following how to understand the content of a video in a blink in less than lines of Python code Here is a visual summary example generated from a video made of sequences shots Note The summary is a grid where each cell is a frame representing a video shot ObjectivesThis tutorial has objectives practical and technical Automatically generate visual summaries of videosBuild a processing pipeline with these properties managed always ready and easy to set up scalable able to ingest several videos in parallel not costing anything when not used ️ToolsA few tools are enough Storage space for videos and resultsA serverless solution to run the codeA machine learning model to analyze videosA library to extract frames from videosA library to generate the visual summaries ArchitectureHere is a possible architecture using Google Cloud services Cloud Storage Cloud Functions and Video Intelligence API The processing pipeline follows these steps You upload a video to the st bucket a bucket is a storage space in the cloud The upload event automatically triggers the st functionThe function sends a request to the Video Intelligence API to detect the shotsThe Video Intelligence API analyzes the video and uploads the results annotations to the nd bucketThe upload event triggers the nd functionThe function downloads both annotation and video filesThe function renders and uploads the summary to the rd bucketThe video summary is ready Python librariesOpen source client libraries let you interface with Google Cloud services in idiomatic Python You ll use the following Cloud StorageTo manage 
downloads and uploadsVideo Intelligence APITo analyze videosHere is a choice of additional Python libraries for the graphical needs OpenCVTo extract video framesThere s even a headless version without GUI features which is ideal for a servicePillowTo generate the visual summariesPillow is a very popular imaging library both extensive and easy to use ️Project setupAssuming you have a Google Cloud account you can set up the architecture from Cloud Shell with the gcloud and gsutil commands This lets you script everything from scratch in a reproducible way Environment variables ProjectPROJECT NAME Visual Summary PROJECT ID visual summary REPLACE WITH UNIQUE SUFFIX Cloud Storage region GCS REGION europe west Cloud Functions region GCF REGION europe west SourceGIT REPO cherry on py PROJECT SRC PROJECT ID GIT REPO gcf video summary Cloud Storage buckets environment variables export VIDEO BUCKET b videos PROJECT ID export ANNOTATION BUCKET b annotations PROJECT ID export SUMMARY BUCKET b summaries PROJECT ID Note You can use your GitHub username as a unique suffix New projectgcloud projects create PROJECT ID name PROJECT NAME set as defaultCreate in progress for Waiting for operations cp to finish done Enabling service cloudapis googleapis com on project PROJECT ID Operation operations acf finished successfully Updated property core project to PROJECT ID Billing account Link project with billing account single account BILLING ACCOUNT gcloud beta billing accounts list format value name Link project with billing account specific one among multiple accounts BILLING ACCOUNT gcloud beta billing accounts list format value name filter displayName My Billing Account gcloud beta billing projects link PROJECT ID billing account BILLING ACCOUNTbillingAccountName billingAccounts XXXXXX YYYYYY ZZZZZZbillingEnabled truename projects PROJECT ID billingInfoprojectId PROJECT ID Buckets Create buckets with uniform bucket level accessgsutil mb b on c regional l GCS REGION gs VIDEO 
BUCKETgsutil mb b on c regional l GCS REGION gs ANNOTATION BUCKETgsutil mb b on c regional l GCS REGION gs SUMMARY BUCKETCreating gs VIDEO BUCKET Creating gs ANNOTATION BUCKET Creating gs SUMMARY BUCKET You can check how it looks like in the Cloud Console Service accountCreate a service account This is for development purposes only not needed for production This provides you with credentials to run your code locally mkdir PROJECT IDcd PROJECT IDSERVICE ACCOUNT NAME dev service account SERVICE ACCOUNT SERVICE ACCOUNT NAME PROJECT ID iam gserviceaccount com gcloud iam service accounts create SERVICE ACCOUNT NAMEgcloud iam service accounts keys create PROJECT ID key json iam account SERVICE ACCOUNTCreated service account SERVICE ACCOUNT NAME created key of type json as PROJECT ID key json for SERVICE ACCOUNT Set the GOOGLE APPLICATION CREDENTIALS environment variable and check that it points to the service account key When you run the application code in the current shell session client libraries will use these credentials for authentication If you open a new shell session set the variable again export GOOGLE APPLICATION CREDENTIALS PROJECT ID key jsoncat GOOGLE APPLICATION CREDENTIALS type service account project id PROJECT ID private key id private key BEGIN PRIVATE KEY n client email SERVICE ACCOUNT Authorize the service account to access the buckets IAM BINDING serviceAccount SERVICE ACCOUNT roles storage objectAdmin gsutil iam ch IAM BINDING gs VIDEO BUCKETgsutil iam ch IAM BINDING gs ANNOTATION BUCKETgsutil iam ch IAM BINDING gs SUMMARY BUCKET APIsA few APIs are enabled by default gcloud services listNAME TITLEbigquery googleapis com BigQuery APIbigquerystorage googleapis com BigQuery Storage APIcloudapis googleapis com Google Cloud APIsclouddebugger googleapis com Cloud Debugger APIcloudtrace googleapis com Cloud Trace APIdatastore googleapis com Cloud Datastore APIlogging googleapis com Cloud Logging APImonitoring googleapis com Cloud Monitoring 
APIservicemanagement googleapis com Service Management APIserviceusage googleapis com Service Usage APIsql component googleapis com Cloud SQLstorage api googleapis com Google Cloud Storage JSON APIstorage component googleapis com Cloud StorageEnable the Video Intelligence and Cloud Functions APIs gcloud services enable videointelligence googleapis com cloudfunctions googleapis comOperation operations acf finished successfully Source codeRetrieve the source code cd PROJECT IDgit clone GIT REPO gitCloning into GIT REPO Video analysis Video shot detectionThe Video Intelligence API is a pre trained machine learning model that can analyze videos One of the multiple features is video shot detection For the st Cloud Function here is a possible core function calling annotate video with the SHOT CHANGE DETECTION feature from google cloud import storage videointelligencedef launch shot detection video uri str annot bucket str Detect video shots asynchronous operation Results will be stored in lt annot uri gt with this naming convention video uri gs video bucket path to video ext annot uri gs annot bucket video bucket path to video ext json print f Launching shot detection for lt video uri gt video blob storage Blob from string video uri video bucket video blob bucket name path to video video blob name annot uri f gs annot bucket video bucket path to video json video client videointelligence VideoIntelligenceServiceClient features videointelligence enums Feature SHOT CHANGE DETECTION video client annotate video input uri video uri features features output uri annot uri Local development and testsBefore deploying the function you need to develop and test it Create a Python virtual environment and activate it cd PROJECT IDpython m venv venvsource venv bin activateInstall the dependencies pip install r PROJECT SRC gcf detect shots requirements txtCheck the dependencies pip listPackage Version google cloud storage google cloud videointelligence You can use the main scope to test 
the function in script mode import osANNOTATION BUCKET os getenv ANNOTATION BUCKET assert ANNOTATION BUCKET Undefined ANNOTATION BUCKET environment variable if name main Only for local tests import argparse parser argparse ArgumentParser parser add argument video uri type str help gs video bucket path to video ext args parser parse args launch shot detection args video uri ANNOTATION BUCKET Note You have already exported the ANNOTATION BUCKET environment variable earlier in the shell session you will also define it later at deployment stage This makes the code generic and lets you reuse it independently of the output bucket Test the function VIDEO PATH cloudmleap video next gbikes dinosaur mp VIDEO URI gs VIDEO PATH python PROJECT SRC gcf detect shots main py VIDEO URILaunching shot detection for lt gs cloudmleap video next gbikes dinosaur mp gt Note The test video lt gbikes dinosaur mp gt is located in an external bucket This works because the video is publicly accessible Wait a moment and check that the annotations have been generated gsutil ls r gs ANNOTATION BUCKET YYYY MM DDThh mm ssZ gs ANNOTATION BUCKET VIDEO PATH jsonTOTAL objects bytes B Check the last bytes of the annotation file gsutil cat r gs ANNOTATION BUCKET VIDEO PATH json start time offset seconds nanos end time offset seconds nanos Note Those are the start and end positions of the last video shot Everything seems fine Clean up when you re finished gsutil rm gs ANNOTATION BUCKET VIDEO PATH jsondeactivaterm rf venv Function entry pointdef gcf detect shots data context Cloud Function triggered by a new Cloud Storage object video bucket data bucket path to video data name video uri f gs video bucket path to video launch shot detection video uri ANNOTATION BUCKET Note This function will be called whenever a video is uploaded to the bucket defined as a trigger Function deploymentDeploy the st function GCF NAME gcf detect shots GCF SOURCE PROJECT SRC gcf detect shots GCF ENTRY POINT gcf detect shots GCF 
TRIGGER BUCKET VIDEO BUCKET GCF ENV VARS ANNOTATION BUCKET ANNOTATION BUCKET GCF MEMORY MB gcloud functions deploy GCF NAME runtime python source GCF SOURCE entry point GCF ENTRY POINT update env vars GCF ENV VARS trigger bucket GCF TRIGGER BUCKET region GCF REGION memory GCF MEMORY quietNote The default memory allocated for a Cloud Function is MB possible values are MB MB MB MB and MB As the function has no memory or CPU needs it sends a simple API request the minimum memory setting is enough Deploying function may take a while up to minutes done availableMemoryMb entryPoint gcf detect shotsenvironmentVariables ANNOTATION BUCKET b annotations eventTrigger eventType google storage object finalize status ACTIVEtimeout supdateTime YYYY MM DDThh mm ss mmmZ versionId Note The ANNOTATION BUCKET environment variable is defined with the update env vars flag Using an environment variable lets you deploy the exact same code with different trigger and output buckets Here is how it looks like in the Cloud Console Production testsMake sure to test the function in production Copy a video into the video bucket VIDEO NAME gbikes dinosaur mp SRC URI gs cloudmleap video next VIDEO NAME DST URI gs VIDEO BUCKET VIDEO NAME gsutil cp SRC URI DST URICopying gs cloudmleap video next gbikes dinosaur mp Content Type video mp files MiB MiB Operation completed over objects MiB Query the logs to check that the function has been triggered gcloud functions logs read region GCF REGIONLEVEL NAME EXECUTION ID TIME UTC LOGD gcf detect shots Function execution startedI gcf detect shots Launching shot detection for lt gs VIDEO BUCKET VIDEO NAME gt D gcf detect shots Function execution took ms finished with status ok Wait a moment and check the annotation bucket gsutil ls r gs ANNOTATION BUCKETYou should see the annotation file gs ANNOTATION BUCKET VIDEO BUCKET gs ANNOTATION BUCKET VIDEO BUCKET VIDEO NAME jsonThe st function is operational ️Visual Summary Code structureIt s interesting to split the 
code into main classes StorageHelper for local file and cloud storage object managementVideoProcessor for graphical processingsHere is a possible core function class VideoProcessor staticmethod def generate summary annot uri str output bucket str Generate a video summary from video shot annotations try with StorageHelper annot uri output bucket as storage with VideoProcessor storage as video proc print Generating summary image video proc render summary video proc upload summary as jpeg image except logging exception Could not generate summary from shot annotations lt s gt annot uri Note If exceptions are raised it s handy to log them with logging exception to get a stack trace in production logs Class StorageHelperThe class manages the following The retrieval and parsing of video shot annotationsThe download of source videosThe upload of generated visual summariesFile namesclass StorageHelper Local Cloud storage helper Uses a temp dir for local processing e g video frame extraction Paths are relative to this temp dir named after the output bucket Naming convention video uri gs video bucket path to video ext annot uri gs annot bucket video bucket path to video ext json video path video bucket path to video ext summary path video bucket path to video ext SUFFIX summary uri gs output bucket video bucket path to video ext SUFFIX client storage Client upload bucket storage Bucket shots VideoShots video path Path video local path Path ANNOT EXT json VideoShots List VideoShot def init self annot uri str output bucket str if not annot uri endswith self ANNOT EXT raise RuntimeError f annot uri must end with lt self ANNOT EXT gt self upload bucket self client bucket output bucket self shots self load annotations annot uri self video path self video path from uri annot uri temp root Path tempfile gettempdir output bucket temp root mkdir parents True exist ok True self video local path temp root joinpath self video path The source video is handled in the with statement context 
manager def enter self self download video return self def exit self exc type exc value traceback self video local path unlink Note Once downloaded the video uses memory space in the tmp RAM disk the only writable space for the serverless function It s best to delete temporary files when they re not needed anymore to avoid potential out of memory errors on future invocations of the function Annotations are retrieved with the methods storage Blob download as string and json loads def load annotations self annot uri str gt VideoShots json blob storage Blob from string annot uri self client api response json loads json blob download as string annotations Dict api response annotation results shot annotations return VideoShot from dict annotation for annotation in annotations The parsing is handled with this VideoShot helper class class VideoShot NamedTuple Video shot start end positions in nanoseconds pos ns int pos ns int NANOS PER SECOND classmethod def from dict cls annotation Dict gt VideoShot def time offset in ns time offset gt int seconds int time offset get seconds nanos int time offset get nanos return seconds cls NANOS PER SECOND nanos pos ns time offset in ns annotation start time offset pos ns time offset in ns annotation end time offset return cls pos ns pos ns Video shot info can be exposed with a getter and a generator def shot count self gt int return len self shots def gen video shots self gt Iterator VideoShot for video shot in self shots yield video shotThe naming convention was chosen to keep consistent object paths between the different buckets This also lets you deduce the video path from the annotation URI def video path from uri self annot uri str gt Path annot blob storage Blob from string annot uri return Path annot blob name len self ANNOT EXT The video is directly downloaded with storage Blob download to filename def download video self video uri f gs self video path as posix blob storage Blob from string video uri self client print f 
Downloading gt self video local path self video local path parent mkdir parents True exist ok True blob download to filename self video local path On the opposite results can be uploaded with storage Blob upload from string def upload summary self image bytes bytes image type str path self summary path image type blob self upload bucket blob path as posix content type f image image type print f Uploading gt blob name blob upload from string image bytes content type Note from string means from bytes here Python legacy Pillow supports working with memory images which avoids having to manage local files And finally here is a possible naming convention for the summary files def summary path self image type str gt Path video name self video path name shot count self shot count suffix f summary shot count d image type summary name f video name suffix return Path self video path parent summary name Class VideoProcessorThe class manages the following Video frame extractionVisual summary generationimport cv as cvfrom PIL import Imagefrom storage helper import StorageHelperclass VideoProcessor class ImageSize NamedTuple w int h int storage StorageHelper video cv VideoCapture cell size ImageSize grid size ImageSize def init self storage StorageHelper self storage storageOpening and closing the video is handled in the with statement context manager def enter self video path self storage video local path self video cv VideoCapture str video path if not self video isOpened raise RuntimeError f Could not open video lt video path gt self compute grid dimensions return self def exit self exc type exc value traceback self video release The video summary is a grid of cells which can be rendered in a single loop with two generators def render summary self shot ratio float gt Image grid img Image new RGB self grid size self RGB BACKGROUND img and pos iter zip self gen cell img shot ratio self gen cell pos for cell img cell pos in img and pos iter cell img thumbnail self cell size Make 
it smaller if needed grid img paste cell img cell pos return grid imgNote shot ratio is set to by default to extract video shot middle frames The first generator yields cell images def gen cell img self shot ratio float gt Iterator Image assert lt shot ratio lt MS IN NS for video shot in self storage gen video shots pos ns pos ns video shot pos ms pos ns shot ratio pos ns pos ns MS IN NS yield self image at pos pos ms The second generator yields cell positions def gen cell pos self gt Iterator Tuple int int cell x cell y while True yield cell x cell y cell x self cell size w if self grid size w lt cell x Move to next row cell x cell y cell y self cell size hOpenCV easily allows extracting video frames at a given position def image at pos self pos ms float gt Image self video set cv CAP PROP POS MSEC pos ms ok cv frame self video read if not ok raise RuntimeError f Failed to get video frame pos ms pos ms return Image fromarray cv cvtColor cv frame cv COLOR BGRRGB Choosing the summary grid composition is arbitrary Here is an example to compose a summary preserving the video proportions def compute grid dimensions self shot count self storage shot count if shot count lt raise RuntimeError f Expected video shots got shot count Try to preserve the video aspect ratio Consider cells as pixels and try to fit them in a square cols rows int shot count if cols rows lt shot count cols cell w int self video get cv CAP PROP FRAME WIDTH cell h int self video get cv CAP PROP FRAME HEIGHT if self SUMMARY MAX SIZE w lt cell w cols scale self SUMMARY MAX SIZE w cell w cols cell w int scale cell w cell h int scale cell h self cell size self ImageSize cell w cell h self grid size self ImageSize cell w cols cell h rows Finally Pillow gives full control on image serializations def upload summary as jpeg self image Image mem file BytesIO image type jpeg jpeg save parameters dict optimize True progressive True image save mem file format image type jpeg save parameters image bytes mem file 
getvalue self storage upload summary image bytes image type Note Working with in memory images avoids managing local files and uses less memory Local development and testsYou can use the main scope to test the function in script mode import osfrom video processor import VideoProcessorSUMMARY BUCKET os getenv SUMMARY BUCKET assert SUMMARY BUCKET Undefined SUMMARY BUCKET environment variable if name main Only for local tests import argparse parser argparse ArgumentParser parser add argument annot uri type str help gs annotation bucket path to video ext json args parser parse args VideoProcessor generate summary args annot uri SUMMARY BUCKET Test the function cd PROJECT IDpython m venv venvsource venv bin activatepip install r PROJECT SRC gcf generate summary requirements txtVIDEO NAME gbikes dinosaur mp ANNOTATION URI gs ANNOTATION BUCKET VIDEO BUCKET VIDEO NAME json python PROJECT SRC gcf generate summary main py ANNOTATION URIDownloading gt tmp SUMMARY BUCKET VIDEO BUCKET VIDEO NAMEGenerating summary Uploading gt VIDEO BUCKET VIDEO NAME summary jpegNote The uploaded video summary shows shots Clean up deactivaterm rf venv Function entry pointdef gcf generate summary data context Cloud Function triggered by a new Cloud Storage object annotation bucket data bucket path to annotation data name annot uri f gs annotation bucket path to annotation VideoProcessor generate summary annot uri SUMMARY BUCKET Note This function will be called whenever an annotation file is uploaded to the bucket defined as a trigger Function deploymentGCF NAME gcf generate summary GCF SOURCE PROJECT SRC gcf generate summary GCF ENTRY POINT gcf generate summary GCF TRIGGER BUCKET ANNOTATION BUCKET GCF ENV VARS SUMMARY BUCKET SUMMARY BUCKET GCF TIMEOUT s GCF MEMORY MB gcloud functions deploy GCF NAME runtime python source GCF SOURCE entry point GCF ENTRY POINT update env vars GCF ENV VARS trigger bucket GCF TRIGGER BUCKET region GCF REGION timeout GCF TIMEOUT memory GCF MEMORY quietNotes The 
default timeout for a Cloud Function is seconds As you re deploying a background function with potentially long processings set it to the maximum value seconds minutes You also need to bump up the memory a little for the video and image processings Depending on the size of your videos and the maximum resolution of your output summaries or if you need to generate the summary faster memory size and vCPU speed are correlated you might use a higher value MB or MB Deploying function may take a while up to minutes done availableMemoryMb entryPoint gcf generate summaryenvironmentVariables SUMMARY BUCKET b summaries status ACTIVEtimeout supdateTime YYYY MM DDThh mm ss mmmZ versionId Here is how it looks like in the Cloud Console Production testsMake sure to test the function in production You can upload an annotation file in the nd bucket VIDEO NAME gbikes dinosaur mp ANNOTATION FILE VIDEO NAME json ANNOTATION URI gs ANNOTATION BUCKET VIDEO BUCKET ANNOTATION FILE gsutil cp ANNOTATION URI gsutil cp ANNOTATION FILE ANNOTATION URIrm ANNOTATION FILENote This reuses the previous local test annotation file and overwrites it Overwriting a file in a bucket also triggers attached functions Wait a few seconds and query the logs to check that the function has been triggered gcloud functions logs read region GCF REGIONLEVEL NAME EXECUTION ID TIME UTC LOG D gcf generate summary Function execution startedI gcf generate summary Downloading gt tmp SUMMARY BUCKET VIDEO BUCKET VIDEO NAMEI gcf generate summary Generating summary I gcf generate summary Uploading gt VIDEO BUCKET VIDEO NAME summary jpegD gcf generate summary Function execution took ms finished with status ok The nd function is operational and the pipeline is in place You can now do end to end tests by copying new videos in the st bucket ResultsDownload the generated summary on your computer cd PROJECT IDgsutil cp r gs SUMMARY BUCKET jpeg cloudshell download jpegHere is the visual summary for gbikes dinosaur mp detected shots 
You can also directly preview the file from the Cloud Console.

Cherry on the Py

Now, the icing on the cake (or the "cherry on the pie" as we say in French): based on the same architecture and code, you can add a few features:

- Trigger the processing for videos from other buckets
- Generate summaries in multiple formats (such as JPEG, PNG, WEBP)
- Generate animated summaries (also in multiple formats, such as GIF, PNG, WEBP)

Enrich the architecture to duplicate 2 items:

- The video shot detection function, to get it to run as an HTTP endpoint
- The summary generation function, to handle animated images

Adapt the code to support the new features:

- An animated parameter to generate still or animated summaries
- Save and upload the results in multiple formats

Architecture (v2):

- A) Video shot detection can also be triggered manually with an HTTP GET request
- B) Still and animated summaries are generated in 2 functions in parallel
- C) Summaries are uploaded in multiple image formats

HTTP entry point

```python
def gcf_detect_shots_http(request):
    """Cloud Function triggered by an HTTP GET request"""
    if request.method != "GET":
        return "Please use a GET request"
    if not request.args or "video_uri" not in request.args:
        return "Please specify a 'video_uri' parameter"
    video_uri = request.args["video_uri"]
    launch_shot_detection(video_uri, ANNOTATION_BUCKET)
    return f"Launched shot detection for video_uri <{video_uri}>"
```

Note: This is the same code as gcf_detect_shots, with the video URI parameter provided from a GET request.

Function deployment

```bash
GCF_NAME="gcf_detect_shots_http"
GCF_SOURCE="$PROJECT_SRC/gcf_detect_shots"
GCF_ENTRY_POINT="gcf_detect_shots_http"
GCF_TRIGGER_BUCKET="$VIDEO_BUCKET"
GCF_ENV_VARS="ANNOTATION_BUCKET=$ANNOTATION_BUCKET"
GCF_MEMORY="128MB"

gcloud functions deploy $GCF_NAME \
  --runtime python37 \
  --source $GCF_SOURCE \
  --entry-point $GCF_ENTRY_POINT \
  --update-env-vars $GCF_ENV_VARS \
  --trigger-http \
  --region $GCF_REGION \
  --memory $GCF_MEMORY \
  --quiet
```

Here is how it looks in the Cloud Console.
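The HTTP entry point's request validation can be exercised locally with a simple request stub. A minimal sketch, with `SimpleNamespace` standing in for the Flask request object and the actual shot-detection launch replaced by a no-op:

```python
from types import SimpleNamespace

def handle(request):
    # Same validation flow as gcf_detect_shots_http, minus the actual
    # shot-detection launch (out of scope for a local sketch).
    if request.method != "GET":
        return "Please use a GET request"
    if not request.args or "video_uri" not in request.args:
        return "Please specify a 'video_uri' parameter"
    video_uri = request.args["video_uri"]
    return f"Launched shot detection for video_uri <{video_uri}>"

req = SimpleNamespace(method="GET", args={"video_uri": "gs://bucket/video.mp4"})
print(handle(req))  # Launched shot detection for video_uri <gs://bucket/video.mp4>
```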
Animation support

Add an animated option in the core function:

```python
class VideoProcessor:
    @staticmethod
    def generate_summary(annot_uri: str, output_bucket: str, animated=False):
        """Generate a video summary from video shot annotations"""
        try:
            with StorageHelper(annot_uri, output_bucket) as storage:
                with VideoProcessor(storage) as video_proc:
                    print("Generating summary...")
                    if animated:
                        video_proc.generate_summary_animations()
                    else:
                        video_proc.generate_summary_stills()
        except Exception:
            logging.exception(
                "Could not generate summary from shot annotations <%s>", annot_uri
            )
```

Define the formats you're interested in generating:

```python
class ImageFormat:
    # See the Pillow documentation on image file formats
    image_format: str
    save_parameters: Dict  # Make a copy if updated

class ImageJpeg(ImageFormat):
    image_format = "jpeg"
    save_parameters = dict(optimize=True, progressive=True)

class ImageGif(ImageFormat):
    image_format = "gif"
    save_parameters = dict(optimize=True)

class ImagePng(ImageFormat):
    image_format = "png"
    save_parameters = dict(optimize=True)

class ImageWebP(ImageFormat):
    image_format = "webp"
    save_parameters = dict(lossless=False, quality=80, method=6)

SUMMARY_STILL_FORMATS = (ImageJpeg, ImagePng, ImageWebP)
SUMMARY_ANIMATED_FORMATS = (ImageGif, ImagePng, ImageWebP)
```

Add support to generate still and animated summaries in different formats:

```python
def generate_summary_stills(self):
    image = self.render_summary()
    for image_format in self.SUMMARY_STILL_FORMATS:
        self.upload_summary([image], image_format)

def generate_summary_animations(self):
    frame_count = self.ANIMATION_FRAMES
    images = []
    for frame_index in range(frame_count):
        shot_ratio = (frame_index + 1) / frame_count
        print(f"shot_ratio: {shot_ratio:.0%}")
        image = self.render_summary(shot_ratio)
        images.append(image)
    for image_format in self.SUMMARY_ANIMATED_FORMATS:
        self.upload_summary(images, image_format)
```

The serialization can still take place in a single function:

```python
def upload_summary(self, images: List[Image.Image], image_format: Type[ImageFormat]):
    if not images:
        raise RuntimeError("Empty image list")
    mem_file = BytesIO()
    image_type = image_format.image_format
    save_parameters = dict(image_format.save_parameters)  # Copy
    animated = 1 < len(images)
    if animated:
        save_parameters.update(
            save_all=True,
            append_images=images[1:],
            duration=self.ANIMATION_FRAME_DURATION_MS,
            loop=0,  # Infinite loop
        )
    images[0].save(mem_file, format=image_type, **save_parameters)
    image_bytes = mem_file.getvalue()
    self.storage.upload_summary(image_bytes, image_type, animated)
```
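The key detail in the serialization step is that animation keywords are merged into a copy of the format's base parameters only when there are multiple frames, so the shared class-level dict is never mutated. That dict logic can be sketched and checked without Pillow; the helper name and the 100 ms default duration are assumptions for this sketch:

```python
def build_save_parameters(base: dict, frame_count: int, duration_ms: int = 100) -> dict:
    # Copy the format's class-level parameters so the shared dict is
    # never mutated, then add animation keywords for multi-frame output.
    params = dict(base)
    if frame_count > 1:
        params.update(save_all=True, duration=duration_ms, loop=0)  # loop=0: infinite
    return params

print(build_save_parameters({"optimize": True}, 1))   # {'optimize': True}
print(build_save_parameters({"optimize": True}, 12))
# {'optimize': True, 'save_all': True, 'duration': 100, 'loop': 0}
```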
Note: Pillow is both versatile and consistent, allowing for significant and clean code factorization.

Add an animated optional parameter to the StorageHelper class:

```python
class StorageHelper:
    def upload_summary(self, image_bytes: bytes, image_type: str, animated=False):
        path = self.summary_path(image_type, animated)
        blob = self.upload_bucket.blob(path.as_posix())
        content_type = f"image/{image_type}"
        print(f"Uploading -> {blob.name}")
        blob.upload_from_string(image_bytes, content_type)

    def summary_path(self, image_type: str, animated=False) -> Path:
        video_name = self.video_path.name
        shot_count = self.shot_count
        still_or_anim = "anim" if animated else "still"
        suffix = f".summary{shot_count:03d}_{still_or_anim}.{image_type}"
        summary_name = f"{video_name}{suffix}"
        return Path(self.video_path.parent, summary_name)
```

And finally, add an ANIMATED optional environment variable in the entry point:

```python
ANIMATED = os.getenv("ANIMATED", "0") == "1"

def gcf_generate_summary(data, context):
    # ...
    VideoProcessor.generate_summary(annot_uri, SUMMARY_BUCKET, ANIMATED)

if __name__ == "__main__":
    # ...
    VideoProcessor.generate_summary(args.annot_uri, SUMMARY_BUCKET, ANIMATED)
```

Function deployment

Duplicate the 2nd function with the additional ANIMATED environment variable:

```bash
GCF_NAME="gcf_generate_summary_animated"
GCF_SOURCE="$PROJECT_SRC/gcf_generate_summary"
GCF_ENTRY_POINT="gcf_generate_summary"
GCF_TRIGGER_BUCKET="$ANNOTATION_BUCKET"
GCF_ENV_VARS1="SUMMARY_BUCKET=$SUMMARY_BUCKET"
GCF_ENV_VARS2="ANIMATED=1"
GCF_TIMEOUT="540s"
GCF_MEMORY="2048MB"

gcloud functions deploy $GCF_NAME \
  --runtime python37 \
  --source $GCF_SOURCE \
  --entry-point $GCF_ENTRY_POINT \
  --update-env-vars $GCF_ENV_VARS1 \
  --update-env-vars $GCF_ENV_VARS2 \
  --trigger-bucket $GCF_TRIGGER_BUCKET \
  --region $GCF_REGION \
  --timeout $GCF_TIMEOUT \
  --memory $GCF_MEMORY \
  --quiet
```
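The naming scheme used by summary_path can be mirrored with pure path objects and verified locally. This sketch assumes the 3-digit shot counter and the still/anim markers shown above, with a made-up video path:

```python
from pathlib import PurePosixPath

def summary_path(video_path: str, shot_count: int, image_type: str,
                 animated: bool = False) -> PurePosixPath:
    # Mirror of StorageHelper.summary_path: keep the summary next to the
    # source video, encode the shot count (3 digits) and a still/anim marker.
    video = PurePosixPath(video_path)
    still_or_anim = "anim" if animated else "still"
    suffix = f".summary{shot_count:03d}_{still_or_anim}.{image_type}"
    return video.parent / f"{video.name}{suffix}"

print(summary_path("videos/JaneGoodall.mp4", 4, "png", animated=True))
# videos/JaneGoodall.mp4.summary004_anim.png
```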
Here is how it looks in the Cloud Console.

Final tests

The HTTP endpoint lets you trigger the pipeline with a GET request:

```bash
GCF_NAME="gcf_detect_shots_http"
VIDEO_URI="gs://cloudmleap/video/next/visionapi.mp4"
GCF_URL="https://$GCF_REGION-$PROJECT_ID.cloudfunctions.net/$GCF_NAME?video_uri=$VIDEO_URI"

curl $GCF_URL -H "Authorization: bearer $(gcloud auth print-identity-token)"
```

```
Launched shot detection for video_uri <gs://cloudmleap/video/next/visionapi.mp4>
```

Note: The test video <visionapi.mp4> is located in an external bucket, but is publicly accessible.

In addition, copy one or several videos into the video bucket (you can drag and drop videos). The videos are then processed in parallel. Here are a few logs:

```
LEVEL  NAME                           EXECUTION_ID  LOG
D      gcf_generate_summary_animated  fntslsfwdu…   Function execution took … ms, finished with status: 'ok'
I      gcf_generate_summary           ydvqabafn…    Uploading -> b…-videos/JaneGoodall.mp4.summary…_still.png
I      gcf_generate_summary_animated  qvbjjk…       shot_ratio: …%
I      gcf_generate_summary           ydvqabafn…    Uploading -> b…-videos/JaneGoodall.mp4.summary…_still.webp
D      gcf_generate_summary           ydvqabafn…    Function execution took … ms, finished with status: 'ok'
I      gcf_generate_summary_animated  gdwrzxzst…    shot_ratio: …%
D      gcf_generate_summary           amwmovwkgn…   Function execution took … ms, finished with status: 'ok'
I      gcf_generate_summary_animated  ppfzx…        shot_ratio: …%
I      gcf_generate_summary_animated  iuhsjzr…      Uploading -> b…-videos/JaneGoodall.mp4.summary…_anim.png
I      gcf_generate_summary_animated  iuhsjzr…      Uploading -> b…-videos/JaneGoodall.mp4.summary…_anim.webp
D      gcf_generate_summary_animated  iuhsjzr…      Function execution took … ms, finished with status: 'ok'
```

In the 3rd bucket, you'll find all still and animated summaries. You've already seen the still summary for <JaneGoodall.mp4> as an introduction to this tutorial. In the animated version, and in only a few frames, you get an even better idea of what the whole video is about.

If you don't want to keep your project, you can delete it:
```bash
gcloud projects delete $PROJECT_ID
```

One more thing

You can check the number of effective lines of Python code, excluding licence headers and blank lines:

```bash
first_line_after_licence=…  # first code line after the licence header
find $PROJECT_SRC -name '*.py' -exec tail -n +$first_line_after_licence {} \; | grep -v "^$" | wc -l
```

You did everything in just a few hundred lines of Python. Fewer lines, fewer bugs. Mission accomplished!

See you! I hope you appreciated this tutorial and would love to read your feedback. You can also follow me on Twitter. |
2020-05-29 17:28:11 |
Apple |
AppleInsider - Frontpage News |
Beats announces four new bright Powerbeats Pro colors, coming June 9 |
https://appleinsider.com/articles/20/05/29/beats-announces-four-new-bright-powerbeats-pro-colors-coming-june-9
|
Beats announces four new bright Powerbeats Pro colors, coming June 9. Apple's Beats brand on Friday is announcing four new color options for its Powerbeats Pro wireless headphones: Spring Yellow, Cloud Pink, Lava Red, and Glacier Blue.
2020-05-29 17:41:19 |
Apple |
AppleInsider - Frontpage News |
What to expect inside reopened Apple Stores in the coronavirus era |
https://appleinsider.com/articles/20/05/28/what-to-expect-inside-reopened-apple-stores-in-the-coronavirus-era
|
What to expect inside reopened Apple Stores in the coronavirus era. Apple is in the process of reopening U.S. retail stores, and as expected, those outlets look a lot different post-coronavirus. Here's what we've seen in store, and what you should expect if you need (or want) to make a trip.
2020-05-29 17:20:36 |
海外TECH |
Engadget |
The best deals we found this week: AirPods Pro, Fire TV Cube and more |
https://www.engadget.com/weekly-deals-apple-airpods-pro-amazon-fire-tv-cube-174040116.html
|
The best deals we found this week: AirPods Pro, Fire TV Cube and more. Memorial Day may be behind us, but you can still grab some of its deals. Apple's AirPods Pro are still at a very good sale price at Amazon, and a bunch of Fire TV devices are on sale as well. The Fire TV Cube is on sale right now, and both the F…
2020-05-29 17:40:40 |
海外TECH |
Engadget |
Is there a good reason to buy the Apple Watch Series 5? |
https://www.engadget.com/apple-watch-series-5-user-reviews-wanted-171506324.html
|
Is there a good reason to buy the Apple Watch Series 5? It's been five years since Apple debuted its smartwatch, and in that time the square-faced device has become iconic and infamous. The most recent version offers dimmable faces that keep the screen available at all times, courtesy of its low-tempera…
2020-05-29 17:15:06 |
海外科学 |
NYT > Science |
Live Coronavirus News and Updates |
https://www.nytimes.com/2020/05/29/us/coronavirus-us-usa.html
|
Live Coronavirus News and Updates. Scientists are questioning an influential study on the use of malaria drugs to treat Covid-19. Republicans and North Carolina's governor trade demands over the planned G.O.P. convention.
2020-05-29 17:47:33 |
海外科学 |
NYT > Science |
Coronavirus Live World Updates: Moscow, Spain, Israel |
https://www.nytimes.com/2020/05/29/world/coronavirus-update.html
|
israel |
2020-05-29 17:51:26 |
海外ニュース |
Japan Times latest articles |
Tokyo nears phase two of virus recovery plan as Osaka to fully reopen |
https://www.japantimes.co.jp/news/2020/05/29/national/coronavirus-tokyo-phase-two-osaka-reopen/
|
Tokyo nears phase two of virus recovery plan as Osaka to fully reopen. As business-minded Osaka prepares to go full bore, with Tokyo trailing close behind, infections continue to fluctuate in the capital.
2020-05-30 02:23:35 |
海外ニュース |
Japan Times latest articles |
J. League first division to resume on July 4 |
https://www.japantimes.co.jp/sports/2020/05/29/soccer/j-league/j-league-resumes-july/
|
initial |
2020-05-30 03:49:13 |
海外ニュース |
Japan Times latest articles |
Photo essay: Tokyo without tourists |
https://www.japantimes.co.jp/life/2020/05/29/travel/tokyo-without-tourists/
|
tokyo |
2020-05-30 04:00:02 |
海外ニュース |
Japan Times latest articles |
Japan is going for broke, minus the broke part |
https://www.japantimes.co.jp/opinion/2020/05/29/commentary/japan-commentary/japan-going-broke-minus-broke-part/
|
central |
2020-05-30 04:00:15 |
ニュース |
BBC News - Home |
Coronavirus furlough scheme to finish at end of October, says chancellor |
https://www.bbc.co.uk/news/business-52853333
|
august |
2020-05-29 17:30:13 |
ニュース |
BBC News - Home |
George Floyd death: Ex-officer held in Minneapolis |
https://www.bbc.co.uk/news/world-us-canada-52854025
|
floyd |
2020-05-29 17:57:30 |
ニュース |
BBC News - Home |
Coronavirus: Evening update as firms to start paying share of furlough scheme |
https://www.bbc.co.uk/news/uk-52853273
|
outbreak |
2020-05-29 17:49:33 |
ニュース |
BBC News - Home |
Coronavirus: Relaxing lockdown 'risky' and 'political decision' |
https://www.bbc.co.uk/news/health-52849691
|
levels |
2020-05-29 17:34:46 |
ニュース |
BBC News - Home |
Coronavirus: What does it mean if I've been furloughed by work? |
https://www.bbc.co.uk/news/explainers-52135342
|
scheme |
2020-05-29 17:42:21 |
ニュース |
BBC News - Home |
Coronavirus UK map: How many confirmed cases are there in your area? |
https://www.bbc.co.uk/news/uk-51768274
|
government |
2020-05-29 17:19:47 |
ニュース |
BBC News - Home |
Coronavirus pandemic: Tracking the global outbreak |
https://www.bbc.co.uk/news/world-51235105
|
respiratory |
2020-05-29 17:24:38 |
Azure |
Azure の更新情報 |
Key Vault bring your own key (BYOK) is now generally available |
https://azure.microsoft.com/ja-jp/updates/akv-byok-ga/
|
Key Vault bring your own key (BYOK) is now generally available. A new bring your own key (BYOK) method to import keys securely from on-premises HSMs into Azure Key Vault is now generally available. This BYOK method can be used to import keys from any supported on-premises HSM.
2020-05-29 17:00:29 |