Posted: 2022-04-08 12:18:32 | RSS feed digest for 2022-04-08 12:00 (26 items)

Category | Site | Article title / trending keyword | Link URL | Frequent keywords / summary / search volume | Date registered
IT | ITmedia All Articles | [ITmedia Business Online] AOKI's "shoes" made to match its "pajama suits" are selling well; additional production decided | https://www.itmedia.co.jp/business/articles/2204/08/news097.html | itmedia | 2022-04-08 11:35:00
IT | ITmedia All Articles | [ITmedia Business Online] "Gakuchika" (what students devoted themselves to at school) seen as effective for job hunting: No. 2 is "internship experience"; what is No. 1? | https://www.itmedia.co.jp/business/articles/2204/08/news098.html | itmedia | 2022-04-08 11:34:00
IT | ITmedia All Articles | [ITmedia Business Online] Workman expands its "fan-equipped wear" lineup, adding products for outdoor and everyday use | https://www.itmedia.co.jp/business/articles/2204/08/news051.html | itmedia | 2022-04-08 11:10:00
TECH | Techable | Italian leather adds a touch of luxury! Spigen releases the "Enzo" case for iPhone 13 | https://techable.jp/archives/176683 | iphone | 2022-04-08 02:00:55
AWS | AWS Japan Blog | Tuning and optimizing AWS WAF Bot Control | https://aws.amazon.com/jp/blogs/news/fine-tune-and-optimize-aws-waf-bot-control-mitigation-capability/ | This article clarifies the closely related features and explains key considerations, best practices, and how to customize the mitigation capability for common use cases. | 2022-04-08 02:11:33
Ruby | New posts tagged Ruby - Qiita | listen (LoadError) | https://qiita.com/ro-ze1106/items/c445fa926791b11b8ffc | loaderr | 2022-04-08 11:50:42
AWS | New posts tagged AWS - Qiita | [Daily report feature implementation] A development novice tries a pretend project to introduce the cloud at a company, part 7 | https://qiita.com/Ishii_Taiki/items/4f5dc6002624a4684168 | awssap | 2022-04-08 11:12:51
Docker | New posts tagged docker - Qiita | My own summary of Docker | https://qiita.com/tanakanata7190/items/a1bccd5bdbf85b7c4d95 | docker | 2022-04-08 11:31:51
Docker | New posts tagged docker - Qiita | Setting up a Hyperledger Iroha environment | https://qiita.com/ShunichiMurata/items/4be2b9f1a67aae1dc278 | userdevelopmentdock | 2022-04-08 11:19:27
golang | New posts tagged Go - Qiita | Saying goodbye to goenv and switching between Go versions | https://qiita.com/Vermee81/items/d0b922f4428ad4fb2518 | goenv | 2022-04-08 11:16:26
Overseas TECH | DEV Community | What is GitHub Actions? A not-so-ELI5 introduction in 2022 ⚙️ | https://dev.to/hunghvu/what-is-github-actions-a-not-so-eli5-introduction-in-2022-3ph | GitHub Actions is an automation platform built into GitHub: workflows can be triggered manually, on a schedule, or by events, and while it can form a CI/CD pipeline to build, test, and deploy a codebase, it can also run arbitrary system commands. Its core elements are events, workflows, runners, jobs, and actions. An event (a repository event such as opening a pull request, or an external event delivered through the repository_dispatch webhook) triggers a workflow, a configurable automated process defined by a YAML file in the .github/workflows folder. A runner is a virtual machine (Linux, Windows, or macOS) running the GitHub Actions Runner application, either GitHub-hosted or self-hosted; each job runs on a fresh runner, so environment data is isolated unless explicitly passed between jobs. A job is a sequence of skippable steps, each executing a script, an action, or arbitrary commands; if a step fails or exits with a failure signal, the workflow is cancelled. An action is a reusable task, written in a language that compiles to JavaScript or as a containerized script, and a marketplace of ready-made actions is available. | 2022-04-08 02:46:38
Overseas TECH | DEV Community | Asynchronous Preloading in SpriteKit with Swift | https://dev.to/johansteen/asynchronous-preloading-in-spritekit-with-swift-5h8m | The second and final part of a series on preloading SpriteKit game assets asynchronously with Swift and an OperationQueue. The article defines an AssetPreloading protocol with a single static preloadAssets(withCompletionHandler:) method; entities such as an Enemy GKEntity adopt it, calling SKTextureAtlas's preloadTextureAtlasesNamed to load and decode textures into a static array (so all instances share them, keeping the memory footprint low) before invoking the completion handler. A PreloadOperation subclass of Operation takes an AssetPreloading.Type in its initializer, calls preloadAssets in start(), and marks itself finished in the completion closure. An OperationQueue then runs one PreloadOperation per asset type, with a final BlockOperation that depends on every other operation so it runs only after all preloading has completed. Finally, GameplayKit's GKStateMachine wraps the flow in an AssetLoaderLoadingState and an AssetLoaderReadyState, leaving room for future states such as a loading-progress screen or purging assets from memory. | 2022-04-08 02:04:37
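
The DEV Community entry above walks through a concrete Operation/OperationQueue preloading pattern, so a compact, self-contained Swift sketch of it may help. It keeps the article's names (AssetPreloading, PreloadOperation, and a final BlockOperation that depends on every preload operation), but this is only a sketch under assumptions: the SpriteKit SKTextureAtlas call is replaced by a simulated delay so the code runs anywhere, and the KVO bookkeeping and the finish() helper are additions not spelled out in the entry.

```swift
import Foundation

// Adopted by any type whose assets must be decoded before gameplay starts
// (names taken from the article; the body below is a stand-in).
protocol AssetPreloading: AnyObject {
    static func preloadAssets(withCompletionHandler completionHandler: @escaping () -> Void)
}

// Example adopter. A real game would call
// SKTextureAtlas.preloadTextureAtlasesNamed(_:withCompletionHandler:)
// here and decode textures into a static array.
final class Enemy: AssetPreloading {
    static func preloadAssets(withCompletionHandler completionHandler: @escaping () -> Void) {
        DispatchQueue.global().async {
            Thread.sleep(forTimeInterval: 0.1)   // pretend we decode textures
            completionHandler()                  // tell the operation we are done
        }
    }
}

// An asynchronous Operation wrapping one preloader type.
final class PreloadOperation: Operation {
    private let preloader: AssetPreloading.Type
    private var _executing = false
    private var _finished = false

    init(preloader: AssetPreloading.Type) {
        self.preloader = preloader
        super.init()
    }

    override var isAsynchronous: Bool { true }
    override var isExecuting: Bool { _executing }
    override var isFinished: Bool { _finished }

    override func start() {
        guard !isCancelled else { finish(); return }
        willChangeValue(forKey: "isExecuting")
        _executing = true
        didChangeValue(forKey: "isExecuting")
        preloader.preloadAssets { [weak self] in
            self?.finish()
        }
    }

    // Flips the KVO-observed state so the queue and dependents know we are done.
    private func finish() {
        willChangeValue(forKey: "isExecuting")
        willChangeValue(forKey: "isFinished")
        _executing = false
        _finished = true
        didChangeValue(forKey: "isExecuting")
        didChangeValue(forKey: "isFinished")
    }
}

// Queue every preloader, plus one BlockOperation that depends on all of them
// and therefore runs only after every asset has finished loading.
let preloaders: [AssetPreloading.Type] = [Enemy.self]
let queue = OperationQueue()
let completed = BlockOperation {
    print("All preloading done - move the game to its next state")
}
for preloader in preloaders {
    let op = PreloadOperation(preloader: preloader)
    completed.addDependency(op)
    queue.addOperation(op)
}
queue.addOperation(completed)
queue.waitUntilAllOperationsAreFinished()   // only needed for this command-line sketch
```

In the article's full version, the completed BlockOperation advances a GKStateMachine from AssetLoaderLoadingState to AssetLoaderReadyState instead of printing.
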
Overseas science | BBC News - Science & Environment | Rejuvenation of woman's skin could tackle diseases of aging | https://www.bbc.co.uk/news/science-environment-60991675?at_medium=RSS&at_campaign=KARANGA | applications | 2022-04-08 02:22:07
Finance | The Life Insurance Association of Japan | Held a ceremony to present a grant certificate under the program supporting organizations that promote elderly support and health promotion measures (Yamaguchi Prefecture association) | https://www.seiho.or.jp/info/social/2022/cr_20220408_2.html | elderly | 2022-04-08 12:00:00
Finance | The Life Insurance Association of Japan | Presented graduation commemorative gifts to students on grant-type scholarships for certified care worker training (Shiga Prefecture association) | https://www.seiho.or.jp/info/social/2022/cr_20220408_1.html | certified care worker | 2022-04-08 12:00:00
Finance | The Life Insurance Association of Japan | Received a certificate of appreciation for contributing to the improvement of community welfare (Saga Prefecture association) | https://www.seiho.or.jp/info/social/2022/cr_20220408_3.html | community welfare | 2022-04-08 12:00:00
News | JETRO Business News (Tsusho Koho) | Jakarta also begins issuing arrival visas for tourism, covering 43 countries including Japan | https://www.jetro.go.jp/biznews/2022/04/4af79d6dde8aef57.html | start | 2022-04-08 02:10:00
Hokkaido | Hokkaido Shimbun | Paint thrown at the editor-in-chief of the Peace Prize-winning Russian newspaper, injuring his eyes; independent paper | https://www.hokkaido-np.co.jp/article/667142/ | injured | 2022-04-08 11:19:00
Hokkaido | Hokkaido Shimbun | Paint daubed on the national historic site "床机石" (Shogi-ishi) in Nagakute, Aichi, where Ieyasu held a war council | https://www.hokkaido-np.co.jp/article/667141/ | Nagakute, Aichi Prefecture | 2022-04-08 11:18:00
Hokkaido | Hokkaido Shimbun | Arrival of spring conveyed by rushing water: "overflow" at Wakkanai Hokushin Dam | https://www.hokkaido-np.co.jp/article/666952/ | arrival of spring | 2022-04-08 11:16:11
Hokkaido | Hokkaido Shimbun | New car sales in Hokkaido down 16% year on year in March, below the previous year for the ninth straight month | https://www.hokkaido-np.co.jp/article/667136/ | year-on-year decline | 2022-04-08 11:09:00
Hokkaido | Hokkaido Shimbun | Typhoon No. 1 forms near the Caroline Islands in the Pacific | https://www.hokkaido-np.co.jp/article/667135/ | tropical depression | 2022-04-08 11:05:00
Business | Toyo Keizai Online | The frightening hidden side that "people fooled by real estate investment" never see: manga "正直不動産" (Shojiki Fudosan), Vol. 7, Episode 49 | https://toyokeizai.net/articles/-/540109?utm_source=rss&utm_medium=http&utm_campaign=link_back | real estate company | 2022-04-08 11:30:00
IT | Weekly ASCII | Play "FF Origin" in 4K too! The "PG-PD12", with a Core i7-12700 and RTX 3080, recommended for those aiming one step higher | https://weekly.ascii.jp/elem/000/004/087/4087814/ | corei | 2022-04-08 11:45:00
IT | Weekly ASCII | Enjoy refreshing parfaits and drinks! "一〇八抹茶茶廊" (108 Matcha Saro) in Lumine Shinjuku offers original early-summer-themed parfaits and more from April 14 | https://weekly.ascii.jp/elem/000/004/088/4088765/ | refreshing | 2022-04-08 11:30:00
GCP | Cloud Blog | Securely exchange data and analytics assets at scale with Analytics Hub, now available in Preview | https://cloud.google.com/blog/products/data-analytics/analytics-hub-data-exchange-now-in-public-preview/ | Google announces the preview of Analytics Hub, a fully managed service built on BigQuery for exchanging data and analytics assets securely across organizational boundaries without creating multiple copies of the data. Data publishers create exchanges and publish listings; exchanges are private by default and visible only to provisioned users or groups, but can also be made public to all Google Cloud customers. Subscribers browse and search the listings they have access to and, on subscribing, receive a read-only linked dataset in their own project, a symbolic link rather than a copy, which stays in sync with the publisher's data and can be analyzed with BigQuery's built-in streaming, machine learning, geospatial, and BI capabilities. The post cites early adopters including Crux, NBCUniversal, Dun & Bradstreet, Neustar, and ESG Book, and notes that at launch subscribers can access Google-managed public datasets, first-party Google datasets such as Google Trends, and commercial datasets onboarded in partnership with Crux. Planned updates include usage metrics for publishers, parameterized datasets, privacy-safe queries, and commercialization management. | 2022-04-08 03:00:00
