AWS |
AWS Architecture Blog |
Let’s Architect! Optimizing the cost of your architecture |
https://aws.amazon.com/blogs/architecture/lets-architect-optimizing-the-cost-of-your-architecture/
|
Let's Architect! Optimizing the cost of your architecture. Written in collaboration with Ben Moses, AWS Senior Solutions Architect, and Michael Holtby, AWS Senior Manager, Solutions Architecture. Designing an architecture is not a simple task. There are many dimensions and characteristics of a solution to consider, such as availability, performance, or resilience. In this Let's Architect!, we explore cost optimization and ideas on … |
2022-12-07 16:53:00 |
AWS |
AWS Media Blog |
The Quortex approach to live streaming with Amazon EC2 Spot Instances |
https://aws.amazon.com/blogs/media/the-quortex-approach-to-live-streaming-with-amazon-ec2-spot-instances/
|
The Quortex approach to live streaming with Amazon EC2 Spot Instances. This blog was co-authored by Zavisa Bjelogrlic, Senior Partner Solution Architect, AWS; Jérôme Viéron, CTO, Quortex; and Marc Baillavoine, CEO, Quortex. Introduction: Amazon EC2 Spot Instances let you take advantage of unused EC2 capacity at a discount of up to 90% compared to On-Demand pricing. Spot Instances can be reclaimed with a two-minute notification, which … |
2022-12-07 16:16:52 |
AWS |
AWS Government, Education, and Nonprofits Blog |
United Arab Emirates Space Agency and AWS sign agreement to support long-term growth in the region’s space ecosystem |
https://aws.amazon.com/blogs/publicsector/united-arab-emirates-space-agency-aws-sign-agreement-support-growth-regions-space-ecosystem/
|
United Arab Emirates Space Agency and AWS sign agreement to support long-term growth in the region's space ecosystem. The United Arab Emirates Space Agency and AWS have signed a Statement of Strategic Intent and Cooperation that is designed to support the creation of a vibrant, sustainable, competitive, and innovative space sector in the United Arab Emirates (UAE). AWS will collaborate with the United Arab Emirates Space Agency and related UAE government space organizations on three initiatives designed to support the Space Agency's long-term development goals. |
2022-12-07 16:50:00 |
python |
New posts tagged Python - Qiita |
Does feature reduction ever contribute to improved accuracy? Let's actually test it |
https://qiita.com/takku321/items/bf27cc6f6ac1a7ecd1ce
|
twitter |
2022-12-08 01:13:55 |
python |
New posts tagged Python - Qiita |
Working through a ROS2 book in one day: 'Introduction to AI Robots: Build and Learn with ROS2 and Python' |
https://qiita.com/MeRT/items/aa3b6ebf6f4138cccb0e
|
day |
2022-12-08 01:09:53 |
python |
New posts tagged Python - Qiita |
[Python] Building authentication with Cognito |
https://qiita.com/yukiaprogramming/items/a6364596750c43c4c89e
|
defge |
2022-12-08 01:06:27 |
js |
New posts tagged JavaScript - Qiita |
POSTing to a pivot table with Laravel 8 (API) × Vue.js |
https://qiita.com/yonai_shun/items/c8b91c95316fa2809b86
|
insert |
2022-12-08 01:08:51 |
AWS |
New posts tagged AWS - Qiita |
A full-stack engineer passed the AWS Certified Solutions Architect – Associate |
https://qiita.com/technoscape/items/2519b099686f110fadd2
|
sarchitectassociate |
2022-12-08 01:45:05 |
AWS |
New posts tagged AWS - Qiita |
[Python] Building authentication with Cognito |
https://qiita.com/yukiaprogramming/items/a6364596750c43c4c89e
|
defge |
2022-12-08 01:06:27 |
Docker |
New posts tagged docker - Qiita |
[Docker] The error 'load metadata for docker.io/library/○○' appears when running docker-compose build |
https://qiita.com/so__hei__/items/46bac5698aa36fa456bb
|
docker |
2022-12-08 01:53:41 |
GCP |
New posts tagged gcp - Qiita |
On Cloud Bigtable and unstructured data |
https://qiita.com/ha_shio/items/ebec91d2c3e12371f605
|
bigtable |
2022-12-08 01:26:29 |
Git |
New posts tagged Git - Qiita |
It's the end of the year, so I'm deleting the branches that have piled up |
https://qiita.com/TakehiroKATO/items/34c44b30956863b4c7b7
|
ampdesigneradventcalendar |
2022-12-08 01:48:17 |
Overseas TECH |
MakeUseOf |
Have Yeedi Clean the House for You This Holiday Season, Save Hundreds on Your Purchase |
https://www.makeuseof.com/yeedi-holiday-deal/
|
popular |
2022-12-07 16:31:16 |
Overseas TECH |
MakeUseOf |
How to Delete a Netlify App |
https://www.makeuseof.com/netlify-app-delete-how/
|
netlify |
2022-12-07 16:31:16 |
Overseas TECH |
MakeUseOf |
Does the Hyper-V Virtual Machine Show a Black Screen on Windows? Here's How to Fix It |
https://www.makeuseof.com/windows-hyper-v-black-screen/
|
windows |
2022-12-07 16:16:15 |
Overseas TECH |
MakeUseOf |
The December Security Patch for Android Fixes Critical Bluetooth Bug, and 80 More! |
https://www.makeuseof.com/android-december-security-patch-critical-bluetooth-bug/
|
android |
2022-12-07 16:03:43 |
Overseas TECH |
DEV Community |
Web resource caching: Server-side |
https://dev.to/apisix/web-resource-caching-server-side-4hm2
|
Web resource caching: Server-side. The subject of Web resource caching is as old as the World Wide Web itself. However, I'd like to offer an as-exhaustive-as-possible catalog of how one can improve performance by caching. Web resource caching can happen in two different places: client-side, on the browser, and server-side. In the previous post, I explained the former; this post focuses on the latter.

While client-side caching works well, it has one central issue: to serve the resource locally, a client must first have it in the cache. Thus, each client needs its own cached resource. If the requested resource is intensive to compute, it doesn't scale. The idea behind server-side caching is to compute the resource once and serve it from the cache to all clients.

A couple of dedicated server-side resource caching solutions have emerged over the years: Memcached, Varnish, Squid, etc. Other solutions are less focused on web resource caching and more generic, e.g., Redis or Hazelcast. If you want to dive deeper into generic caching solutions, please check these two posts on the subject.

To continue with the sample from last week, I'll use Apache APISIX to demo server-side caching. APISIX relies on the proxy-cache plugin for caching. Unfortunately, at the moment, APISIX doesn't integrate with any third-party caching solution. It offers two options: memory-based and disk-based. In general, the former is faster, but memory is expensive, while the latter is slower, but disk storage is cheap. Within OpenResty, however, the disk option may be faster because of how LuaJIT handles memory. You should probably start with the disk, and if it's not fast enough, mount /dev/shm.

Here are my new routes:

routes:
  - uri: /cache
    upstream_id: …
    plugins:
      proxy-rewrite:
        regex_uri: ["/cache", "…"]
      proxy-cache: {}

Note that the default cache key is the host and the request URI, which includes query parameters. The default proxy-cache configuration uses the default disk-based configuration:

proxy_cache:                          # Proxy Caching configuration
  cache_ttl: …s                       # The default caching time on disk if the upstream does not specify the cache time
  zones:                              # The parameters of a cache
    - name: disk_cache_one            # The name of the cache; the administrator can specify which cache to use by name in the Admin API (disk or memory)
      memory_size: …m                 # The size of shared memory; it's used to store the cache index for the disk strategy, and the cache content for the memory strategy (disk or memory)
      disk_size: …G                   # The size of the disk; it's used to store the cache data (disk)
      disk_path: /tmp/disk_cache_one  # The path to store the cache data (disk)
      cache_levels: …                 # The hierarchy levels of the cache (disk)
    - name: memory_cache
      memory_size: …m

We can test the setup with curl:

curl -v localhost:…/cache

The response is interesting:

< HTTP/1.1 200 OK
< Content-Type: text/html; charset=utf-8
< Content-Length: …
< Connection: keep-alive
< Date: Tue, … Nov … GMT
< Last-Modified: Wed, … Nov … GMT
< ETag: "…"
< Server: APISIX
< Apisix-Cache-Status: MISS
< Accept-Ranges: bytes

Because the cache is empty, APISIX has a cache miss; hence, the response is from the upstream. If we curl again before the default cache expiration period, the response is from the cache:

< HTTP/1.1 200 OK
< Apisix-Cache-Status: HIT

After the expiration period, the response is from the upstream, but the header is explicit:

< HTTP/1.1 200 OK
< Apisix-Cache-Status: EXPIRED

Note that we can explicitly purge the entire cache by using the custom PURGE HTTP method:

curl localhost:…/cache -X PURGE

After purging the cache, the above cycle starts anew. Note that it's also possible to bypass the cache, e.g., for testing purposes. We can configure the plugin accordingly:

routes:
  - uri: /cache
    upstream_id: …
    plugins:
      proxy-cache:
        cache_bypass: ["arg_bypass"]  # Bypass the cache if you send a bypass query parameter with a non-zero value

curl -v "localhost:…/cache?bypass=please"

It serves the resource from the upstream regardless of the cache status:

< HTTP/1.1 200 OK
< Apisix-Cache-Status: BYPASS

For more details on all available configuration parameters, check the proxy-cache plugin documentation.

Conclusion. This post was relatively straightforward. The most challenging issue with server-side caching is the configuration: what to cache, for how long, etc. Unfortunately, it depends significantly on your context, problems, and available resources. You probably need to apply PDCA: guesstimate a relevant configuration, apply it, measure the performance, and rinse and repeat until you find your sweet spot. I hope that with an understanding of both client-side and server-side caching, you'll be able to improve the performance of your applications. The source code is available on GitHub (ajavageek/web-caching).

To go further: Cache API responses; the proxy-cache plugin.

Originally published at A Java Geek on December …th |
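The MISS/HIT/EXPIRED cycle the article demonstrates with curl can be sketched in a few lines of Python. This is an illustrative in-process model of a server-side TTL cache (the class name and TTL value are made up for the example), not APISIX's actual implementation:

```python
import time


class TtlCache:
    """Minimal server-side cache model: compute once, serve all clients until the TTL expires."""

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.store = {}  # key -> (value, stored_at)

    def get(self, key, compute):
        entry = self.store.get(key)
        now = time.monotonic()
        if entry is None:
            status = "MISS"         # never cached: go to the upstream
        elif now - entry[1] >= self.ttl:
            status = "EXPIRED"      # cached, but past its TTL: recompute
        else:
            return entry[0], "HIT"  # served straight from the cache
        value = compute()
        self.store[key] = (value, now)
        return value, status


cache = TtlCache(ttl_seconds=2)
print(cache.get("/cache", lambda: "expensive page")[1])  # MISS on the first request
print(cache.get("/cache", lambda: "expensive page")[1])  # HIT while the entry is fresh
```

Any client asking for the same key within the TTL gets the cached value, which is the scaling advantage over client-side caching described above.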
2022-12-07 16:37:00 |
Overseas TECH |
DEV Community |
What does "if __name__ == '__main__'" do in Python? |
https://dev.to/szabgab/what-does-if-name-main-do-in-python-4o11
|
What does "if __name__ == '__main__'" do in Python? You might have seen the following code snippet in many Python files and wondered what it does and why you would need it:

if __name__ == '__main__':
    ...

In a nutshell, it allows a file to be used both as a stand-alone program (script) and as a module used by some other stand-alone program. Let's see the explanation.

Loading a file as a module. A little background: let's say we have the following two files.

mylib.py:
print("In mylib")

myscript.py:
import mylib
print("In myscript")

If we run

python mylib.py

it will print

In mylib

This is not surprising; we told it to do just that. If we run

python myscript.py

it will print

In mylib
In myscript

This is probably not what we wanted. The print of the imported module was executed, and it was executed before the print of our script. Usually we don't expect anything to happen while we import modules; definitely nothing should be printed to the screen. It happened because when Python imports a file (a module), it executes it at the time of import, which means any code outside of functions will be executed. It is very rare that a module we import has any code outside of functions. So the better approach to writing a module would be this:

Having only functions in modules.

mylib.py:
def display():
    print("In mylib")

def destroy_harddisk():
    ...

myscript1.py:
import mylib
print("In myscript")

myscript2.py:
import mylib
print("In myscript")
mylib.display()

Now we have two functions in our mylib.py file. One of them is the display function we would like to use, and the other is destroy_harddisk, which you would probably not want to execute. (I have not even implemented it, to make sure no one will run it and then come complaining.) If we run the first script,

python myscript1.py

we only see

In myscript

This is not surprising, as now, even though we imported the mylib module, we did not call any of its functions. In order to see the text from the display function, we need to call it. If we run the second script,

python myscript2.py

we see

In myscript
In mylib

This time the content of mylib.py's display function is printed at the time when it is called. However, if we now run

python mylib.py

there is no output. So in order to facilitate the needs of the scripts that import mylib.py, we changed the behavior of mylib.py and ruined it. What we would really like is to see the output of the display function when we execute python mylib.py.

Make both cases work. This is where the expression comes in handy:

mylib.py:
def display():
    print("In mylib")

def destroy_harddisk():
    ...

if __name__ == '__main__':
    display()

myscript1.py:
import mylib
print("In myscript")

myscript2.py:
import mylib
print("In myscript")
mylib.display()

Now if we run python mylib.py, we get

In mylib

just as we got in the first case, and if we run python myscript1.py or python myscript2.py, they will act as they did in the second case. So now we have a file, mylib.py, that can be used both as a module and as a stand-alone program (script).

How does __name__ == '__main__' work? Let's take the first example and make it print the content of the variable __name__.

mylib.py:
print(f"In mylib __name__ = {__name__}")

myscript.py:
import mylib
print(f"In myscript __name__ = {__name__}")

Running mylib,

python mylib.py

we get:

In mylib __name__ = __main__

So when you run the file as a stand-alone program, the variable __name__ contains the strange string '__main__'. What if we run the program script?

python myscript.py

We get the following:

In mylib __name__ = mylib
In myscript __name__ = __main__

Now we have two different variables called __name__: one of them in the file mylib.py, where it contains 'mylib', and one in myscript.py, where it contains '__main__'. Our focus should be the fact that the content of the variable in mylib.py depends on how it was used. When we used it as a stand-alone script, it had '__main__' in it, and when we imported it as a module, it had 'mylib' (the name of the file without the extension) in it. Now if you go back to the 3rd example, to the code we are trying to understand, you can also see it here:

if __name__ == '__main__':
    display()

The condition checks the content of the __name__ variable and calls the display function only if it contains '__main__', meaning that the file was used as a stand-alone program.

When to use it? Now that we know what it does and how it works, let's ask the question: when to use it? The answer is that it is actually rarely needed. It can be used to allow a module to be self-testing, so we would run the tests of the file when it is executed as a stand-alone program; however, these days it is much more common to put our tests in separate files, and even if we include documents that need to be verified using doctest, pytest can execute those without this construct. It can be used to make a module also work as a stand-alone script, but it seems like a back-engineering practice that will backfire. However, it must be used when you use multiprocessing on Windows, because that module works by loading your main script as a module. There might be some other cases as well when it is useful or even required, but I think in general it is needed a lot less than it is actually used. |
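The multiprocessing caveat in the last paragraph can be seen in a short sketch (the worker function and its arguments are invented for illustration). Under the 'spawn' start method used on Windows, each child process re-imports the main file as a module; without the guard, the pool-creation code would run again in every child:

```python
import multiprocessing


def worker(n):
    # Trivial stand-in for real per-process work.
    return n * n


if __name__ == '__main__':
    # The guard keeps the Pool creation from re-running when a child
    # process re-imports this file under the 'spawn' start method.
    with multiprocessing.Pool(processes=2) as pool:
        print(pool.map(worker, [1, 2, 3]))  # [1, 4, 9]
```

Removing the guard on Windows raises a RuntimeError at startup, which is why this is one of the few cases where the construct is genuinely required.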
2022-12-07 16:30:00 |
Overseas TECH |
DEV Community |
My Journey for finding a new job (and a new country/home) |
https://dev.to/djdiox/my-journey-for-finding-a-new-job-and-a-new-countryhome-27po
|
My Journey for finding a new job (and a new country/home). Hi dev.to Community! This is my first post for dev, and I am really excited about this step, if this will work out. I am … and have worked now for around … years as a Full Stack Developer, lately specialized in consulting really big German companies. I was already travelling a lot, especially around South East Asia, and I actually fell in love with these beautiful countries and the people there. So this plan came into my head: since we devs are really gifted, in the sense that we can live and work nearly anywhere, an internet connection and some economy is enough for us to work there. Since I really want this to happen, I made up some plans so I can hopefully soon live underneath palms and write my cloud software. Now it's getting concrete, and I am really looking forward to finally fulfilling my dream, maybe temporarily, or even living there forever. Wish me good luck! Devs are united all over the globe. Here is my resume/CV; the blog is currently under construction, soon at blog.markuswagner.dev; Notion portfolio: done. Hahaha, before I forget: LinkedIn. Great regards, Markus |
2022-12-07 16:28:25 |
Overseas TECH |
DEV Community |
The Fastest Way to Run Mastodon Tests |
https://dev.to/appmap/the-fastest-way-to-run-mastodon-tests-5g03
|
The Fastest Way to Run Mastodon Tests. In the last post, we showed you how to use Dev Containers and AppMap to onboard as a new developer for the open-source federated social media application Mastodon. We showed how AppMap can reveal the internal behavior of the application at runtime. But if you want to start contributing improvements to Mastodon, you'll need to run the rspec test suite to ensure nothing breaks along the way. In this post, we'll show you: how to get your local environment set up for running the Mastodon rspec tests; how to see each AppMap that will be created for each test case; and how AppMap will identify which tests need to be re-run after you make your code changes.

If you haven't yet read Part 1 of getting started, head back and follow the steps to get a working Mastodon development environment deployed locally using Docker and VS Code Dev Containers. You can also clone Mastodon from the AppMap GitHub repo with all the configuration completed for you.

Complete the configuration needed for running tests. While not listed in the documentation for the Mastodon project, you will need to run a webpacker asset precompile command to ensure the Rails application has the necessary files when the tests are executed. To use the correctly configured gems, we'll be running our tests with RAILS_ENV=test; make sure your environment has this variable set as well. When the environment variable RAILS_ENV=test is set, Mastodon doesn't support running the webpack commands. We need to make a small change to the config/webpacker.yml file to support running these tests locally. These changes are already in the appmap branch on the AppMap GitHub repo:

test:
  <<: *default
  compile: true
  dev_server:
    compress: true

RAILS_ENV=test bin/webpack

Now your environment should be fully set up to run the rspec tests.

Run rspec on a single test. Before kicking off a run of the entire test suite, especially one with as many tests as Mastodon, I like to run a single rspec test to make sure our environment is set up correctly:

RAILS_ENV=test bundle exec rspec spec/controllers/settings/applications_controller_spec.rb

> Time: Finished in … seconds (files took … seconds to load); … examples, 0 failures

Success! Now, with that test executed, there were examples that succeeded, and we get a single AppMap for each of those examples.

Run all the rspec tests. Now that we have one set of tests working correctly, let's run the entire test suite:

RAILS_ENV=test bundle exec rspec

While those tests run, you'll start to see a slew of new AppMaps in the VS Code editor, one map per test case executed. You can leave the rspec tests running in the background while you navigate the AppMaps. This will be the only time you need to actually run all of the rspec tests at one time; later on in this post you'll learn how to re-run only the tests that are affected by your changes in the codebase.

Using AppMap to find untested code. Additionally, with all the tests executed, AppMap will pin each function in your source code to show you which AppMaps this code was executed within. If you see a function missing a pin, that simply means AppMap didn't see it executed and doesn't have its data in one of the AppMaps. If you were to have 100% test coverage, congrats: that would mean that any code without a pin could technically be dead code that is no longer used. But in this scenario, it's far more likely that, when finding a function lacking an AppMap annotation, you are looking at areas of the codebase that are missing tests or otherwise have not been recorded by AppMap.

Save hours re-running tests. Now that AppMaps have been created for large segments of our code base, AppMap can use these to identify exactly which tests need re-running after we make adjustments to the code base. Instead of waiting for thousands of rspec tests to run after we make a small change, we can use AppMap's "up to date" feature and Guard to identify the tests that our code changes impacted and re-run only those. Since running all of the Mastodon tests can take … minutes, you can save this time by having AppMap identify only the specific tests impacted by your change and run only those tests. Over the course of a few changes and commits, you can easily save hours waiting for tests to run. There are two ways we can identify and re-run out-of-date tests.

1. Identify out-of-date tests and execute with rspec. For an easy test, we can add a new log line to a function in the "Feed" model. By making this change simple, we'll be able to see exactly how many AppMaps will become out of date, and then query AppMap for the list of tests we'd need to rerun to bring them back up to date. We'll add a single line to our feed model to debug-log the value of limit:

Rails.logger.debug limit

After saving that change, we'll get a notification at the bottom of our VS Code window which shows that AppMaps are now out of date. Looking at the AppMaps in the left column, you'll see they'll be marked as "Out of Date". We can now open the Command Palette in VS Code (View > Command Palette, or Shift-⌘-P on Mac, or Ctrl-Shift-P on Windows). Search for "out of date" in the command picker and select "Copy Out-of-Date Tests to Clipboard". You'll get a popup letting you choose what format you want to copy them in; we'll pick just the file names. This is the list we'll pass to rspec. Now we can just re-run rspec and paste in our tests:

bundle exec rspec spec/controllers/api/v1/timelines/home_controller_spec.rb spec/controllers/api/v1/timelines/list_controller_spec.rb spec/models/home_feed_spec.rb spec/services/batched_remove_status_service_spec.rb spec/services/fan_out_on_write_service_spec.rb

And the tests will finish in about a minute, a fraction of the time needed to run the full test suite:

<snip>
> Time: Finished in 1 minute … seconds (files took … seconds to load); … examples, 0 failures

2. Use Guard to run out-of-date tests. Guard is a Ruby gem that can be used to run test cases automatically when source files change. For example, with guard-minitest and a simple Guardfile, you can run tests as the files are modified. If you are working from the AppMap forked version of Mastodon in the AppMap repo, the Gemfile and Guardfile will already have these settings configured. Add the following to the Gemfile in the test group:

gem 'guard'
gem 'guard-rspec'

Then create a Guardfile with the following:

guard :rspec, cmd: 'RAILS_ENV=test bundle exec rspec' do
  watch(%r{spec/.*\.rb})
end

Start guard with bundle exec guard; then, as you code, run "AppMap: Touch Out-of-Date Test Files". AppMap will touch those test files, updating the modification date on only the ones that need to be re-run. You can see below how that works in practice: as I add a new line to this codebase and save the file, AppMap will identify that there are out-of-date tests; I then run "Touch Out-of-Date Test Files", and Guard will see those files change and automatically run only the necessary rspec tests.

Summary. In this post we showed you: how to prepare your local environment to support running the Mastodon test suite; how to run the Mastodon test cases locally with rspec; how to use AppMap to save hours of development time by identifying the specific test cases to run based on new code changes; and how to use tools like Guard to automate and simplify re-running out-of-date tests when you make changes to the code base. If you are interested in learning more, you can clone the Mastodon project in the AppMap repository. You can add AppMap to your VS Code or JetBrains code editor. And finally, you can join the conversation and chat with other AppMap engineers and users by joining the AppMap community Slack. |
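The core idea behind the "out of date" computation described above — re-run only the tests whose recorded execution touched a changed file — can be sketched in a few lines of Python. The mapping data here is invented for illustration; AppMap derives the real mapping from its runtime recordings:

```python
def out_of_date_tests(coverage, changed_files):
    """Return the tests whose recorded execution touched any changed file."""
    changed = set(changed_files)
    return sorted(test for test, files in coverage.items()
                  if changed & set(files))


# Hypothetical mapping from a test file to the source files its run touched.
coverage = {
    "spec/models/home_feed_spec.rb": ["app/models/feed.rb", "app/models/home_feed.rb"],
    "spec/models/status_spec.rb": ["app/models/status.rb"],
}

print(out_of_date_tests(coverage, ["app/models/feed.rb"]))
# ['spec/models/home_feed_spec.rb']
```

A change to feed.rb selects only the one spec whose recording touched it, which is exactly why pasting the copied list into rspec runs in a minute instead of the full suite's runtime.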
2022-12-07 16:26:03 |
Overseas TECH |
DEV Community |
Watch us build a *truly* full-stack app in just 9 minutes w/ Wasp & ChatGPT 🚀 🤯 |
https://dev.to/wasp/watch-us-build-a-truly-full-stack-app-in-just-9-minutes-w-wasp-chatgpt-iee
|
Watch us build a truly full-stack app in just 9 minutes w/ Wasp & ChatGPT. There's a lot of hype around ChatGPT at the moment, and for good reason: it's amazing. But there's also some very valid criticism that it's simply taking the grunt work out of programming by writing boilerplate for us, which we as developers then have to maintain. PG is totally right in his remark above, but what he doesn't realize is that there are languages out there that attempt to overcome this very problem, and Wasp is one of them. What makes Wasp unique is that it's a framework that uses a super simple language to help you build your web app: front end, server, and deployment. But it's not a complicated language like Java or Python; it's more similar to SQL or JSON, so the learning curve is really quick (technically, it's a Domain-Specific Language, or DSL). Check it out for yourself:

main.wasp:
app todoApp {
  title: "ToDo App",  // visible in tab

  auth: {  // full-stack auth out-of-the-box
    userEntity: User,
    externalAuthEntity: SocialLogin,
    methods: {
      usernameAndPassword: {},
      google: {}
    }
  }
}

route RootRoute { path: "/", to: MainPage }
page MainPage {
  // import your React code
  component: import Main from "@client/Main.js"
}

With this simple file above, Wasp will continually compile a truly full-stack web app for you, with a React front end and an ExpressJS server. You're free to then build out the important features yourself with React, NodeJS, Prisma, and react-query. The great part is you can probably understand the Wasp syntax without even referencing the docs, which means AI can probably work with it easily as well. So rather than having AI create a ton of boilerplate for us, we thought: "Can ChatGPT write Wasp?" If it can, all we need is to have it create that one file, and then the power of Wasp will take care of the rest. No more endless boilerplate. So that's exactly what we set out to find out in the video above. The results? Well, let's just say they speak for themselves. |
2022-12-07 16:14:37 |
Apple |
AppleInsider - Frontpage News |
20-inch foldable MacBook rumored to launch as soon as 2026 |
https://appleinsider.com/articles/22/12/07/20-inch-foldable-macbook-rumored-to-launch-as-soon-as-2026?utm_medium=rss
|
20-inch foldable MacBook rumored to launch as soon as 2026. Rumors from the supply chain suggest that Apple will roll out OLED not just to the iPad, but also to a folding MacBook Pro, as soon as 2026. A folding MacBook could be 20 inches unfolded. Rumors surrounding a folding Apple product have been circulating since Samsung launched its first foldable. Even as most expect an iPhone Fold sometime in the near future, some rumors suggest Apple is looking to larger folding displays too. Read more |
2022-12-07 16:54:13 |
Apple |
AppleInsider - Frontpage News |
Apple's M1 Max MacBook Pro 14-inch with 32GB RAM dips to $2,445, plus $70 off AppleCare |
https://appleinsider.com/articles/22/12/07/apples-m1-max-macbook-pro-14-inch-with-32gb-ram-dips-to-2445-plus-70-off-applecare?utm_medium=rss
|
Apple's M1 Max MacBook Pro 14-inch with 32GB RAM dips to $2,445, plus $70 off AppleCare. Save on the high-end 14-inch MacBook Pro configuration with Apple's M1 Max chip and 32GB of memory, with units in stock and ready to ship with free expedited delivery. This 14-inch MacBook Pro is loaded: Apple's MacBook Pro 14-inch with the M1 Max chip is a powerful and versatile laptop that offers impressive performance and a wide range of features. Equipped with a …-core CPU and …-core GPU, this MacBook Pro is packed with upgrades, making it perfect for handling demanding tasks like video editing, 3D rendering, and image creation. Read more |
2022-12-07 16:54:04 |
Apple |
AppleInsider - Frontpage News |
French environment group files complaint over iPhone repair practices |
https://appleinsider.com/articles/22/12/07/french-environment-group-files-complaint-over-iphone-repair-practices?utm_medium=rss
|
French environment group files complaint over iPhone repair practices. A French environmental group has filed a complaint against Apple because the company restricts the use of unauthorized parts in iPhone repairs. The complaint targets a practice Apple uses for its devices called pairing: it associates the serial numbers of components with a specific iPhone to make sure the repair parts are genuine. Read more |
2022-12-07 16:55:23 |
Apple |
AppleInsider - Frontpage News |
Apple will ship 9 million fewer iPhones in Q4 than expected |
https://appleinsider.com/articles/22/12/07/apple-will-ship-9-million-fewer-iphones-in-q4-than-expected?utm_medium=rss
|
Apple will ship 9 million fewer iPhones in Q4 than expected. Increased uncertainty in China around iPhone 14 Pro manufacturing has caused Morgan Stanley to drop its iPhone shipment prediction by another 9 million units in Q4. iPhone 14 Pro production has been impacted by Foxconn protests: the zero-Covid policy in China has led to protests and riots at Foxconn, where iPhones are manufactured. Despite Foxconn's efforts, employees have quit in droves and new hiring hasn't offset the losses. Read more |
2022-12-07 16:00:37 |
Overseas TECH |
Engadget |
Congress axes media revenue sharing bill after pushback from Google and Meta |
https://www.engadget.com/jcpa-media-sharing-bill-dropped-in-congress-165414746.html?src=rss
|
Congress axes media revenue sharing bill after pushback from Google and Meta. A US government attempt to compensate publishers for web links has fallen apart, as Congress has cut the Journalism Competition and Preservation Act (JCPA) from the annual national defense spending bill. The measure would have made temporary exceptions to antitrust law, letting media outlets negotiate revenue-sharing deals, such as receiving a cut of ad money from links to news articles in search results and social media posts.

The removal comes after extensive resistance from tech firms. Just this week, Facebook owner Meta warned it would "consider removing news" from its platform rather than submit to government-required negotiations for revenue-sharing deals. As with the social media giant's objections to similar legislative efforts in Australia and Canada, the company argued that the JCPA would force companies to pay for content whether or not they wanted to see it. This would supposedly create a "cartel-like entity" that made one company subsidize others. Two industry groups, the Computer & Communications Industry Association and NetChoice, also said they would launch extensive ad campaigns to oppose the JCPA. Both groups include major tech companies like Amazon, Google, and Meta. Google has been a vocal opponent of link revenue shares in the past and only reluctantly agreed to them in countries like France.

Advocacy groups have taken more varied stances. Public Knowledge and its allies were concerned tech companies could be forced to carry extreme content, and that the JCPA favored larger media producers over small publishers. Political critics across the spectrum, meanwhile, have worried that the Act could alternately strip away moderation tools or fuel biased reporting.

It's not certain what will happen to the efforts behind the JCPA. Lead proponent Sen. Amy Klobuchar said politicians "must" find a way to improve compensation for news. However, it's safe to say the media companies that supported the bill won't be happy. The Los Angeles Times, Fox News owner News Corp, and others had argued that the would-be law was necessary to counter years of declining ad revenue in the shift toward online news coverage. For now, at least, they won't have that potential help. |
2022-12-07 16:54:14 |
Overseas Science |
NYT > Science |
Oldest Known DNA Paints Picture of a Once-Lush Arctic |
https://www.nytimes.com/2022/12/07/science/oldest-dna-greenland-species.html
|
Oldest Known DNA Paints Picture of a Once-Lush Arctic. In Greenland's permafrost, scientists discovered two-million-year-old genetic material from scores of plant and animal species, including mastodons, geese, lemmings, and ants. |
2022-12-07 16:14:21 |
Overseas TECH |
WIRED |
San Francisco Just Reversed Its Killer Robot Plan |
https://www.wired.com/story/san-francisco-police-killer-robots-ban/
|
San Francisco Just Reversed Its Killer Robot Plan. The city's board of supervisors has rolled back a controversial decision to let robots use lethal force without human intervention. But the fight is far from over. |
2022-12-07 16:31:00 |
Finance |
Financial Services Agency website |
Published a draft partial amendment to the 'Designation of transactions and loans excluded from transactions subject to the financial instruments obligation assumption business'. |
https://www.fsa.go.jp/news/r4/shouken/20221207/20221207.html
|
financial instruments |
2022-12-07 17:00:00 |
Finance |
Financial Services Agency website |
Published the agenda for the 24th meeting of the Internal Control Subcommittee of the Business Accounting Council. |
https://www.fsa.go.jp/singi/singi_kigyou/siryou/naibu/20221208.html
|
business accounting |
2022-12-07 17:00:00 |
Finance |
Financial Services Agency website |
Updated the collection of reference materials on the moneylending business. |
https://www.fsa.go.jp/status/kasikin/20221207/index.html
|
related |
2022-12-07 17:00:00 |
Finance |
Financial Services Agency website |
Published the speech 'Regulating the crypto assets landscape in Japan' delivered at a crypto-asset roundtable hosted by the UK's OMFIF. |
https://www.fsa.go.jp/inter/etc/20221207/20221207.html
|
omfif |
2022-12-07 17:00:00 |
News |
@Nikkei (日本経済新聞) digital edition |
Airline industry chief criticizes "flight shame" short-haul flight bans as "pointless"
https://t.co/KdwN2hEERl |
https://twitter.com/nikkei/statuses/1600522461132443648
|
Aviation |
2022-12-07 16:07:45 |
News |
BBC News - Home |
Border Force staff at airports to strike over Christmas |
https://www.bbc.co.uk/news/uk-politics-63893115?at_medium=RSS&at_campaign=KARANGA
|
border |
2022-12-07 16:40:37 |
News |
BBC News - Home |
Matt Hancock to stand down at next election |
https://www.bbc.co.uk/news/uk-politics-63891100?at_medium=RSS&at_campaign=KARANGA
|
celebrity |
2022-12-07 16:27:54 |
News |
BBC News - Home |
Germany arrests 25 accused of plotting coup |
https://www.bbc.co.uk/news/world-europe-63885028?at_medium=RSS&at_campaign=KARANGA
|
parliament |
2022-12-07 16:08:26 |
News |
BBC News - Home |
Volodymyr Zelensky is Time Magazine's 2022 Person of the Year |
https://www.bbc.co.uk/news/world-63890775?at_medium=RSS&at_campaign=KARANGA
|
editor |
2022-12-07 16:13:38 |
News |
BBC News - Home |
David Fuller: Double murderer sentenced over more mortuary sex abuse |
https://www.bbc.co.uk/news/uk-england-kent-63876517?at_medium=RSS&at_campaign=KARANGA
|
woman |
2022-12-07 16:10:39 |
News |
BBC News - Home |
Taliban conduct first public execution since return to power |
https://www.bbc.co.uk/news/world-asia-63884696?at_medium=RSS&at_campaign=KARANGA
|
confesses |
2022-12-07 16:49:27 |
News |
BBC News - Home |
How do you keep babies safe in the cold? And other questions |
https://www.bbc.co.uk/news/uk-63888234?at_medium=RSS&at_campaign=KARANGA
|
costs |
2022-12-07 16:52:38 |
News |
BBC News - Home |
December strikes: Who is striking and what are their pay claims? |
https://www.bbc.co.uk/news/business-62134314?at_medium=RSS&at_campaign=KARANGA
|
disruption |
2022-12-07 16:02:10 |
News |
BBC News - Home |
World Cup 2022: England v France - Kyle Walker 'will not roll out red carpet' for Kylian Mbappe |
https://www.bbc.co.uk/sport/football/63891285?at_medium=RSS&at_campaign=KARANGA
|
World Cup 2022: England v France — Kyle Walker "will not roll out red carpet" for Kylian Mbappe. Kyle Walker says he will not roll out a red carpet for Kylian Mbappe when England meet France in their World Cup quarter-final. |
2022-12-07 16:32:12 |
GCP |
Cloud Blog |
How partners can maximize their 2023 opportunity with the transformation cloud |
https://cloud.google.com/blog/topics/partners/maximize-our-shared-journey-with-the-transformation-cloud/
|
How partners can maximize their 2023 opportunity with the transformation cloud

The excitement at Next this year was inescapable. We celebrated a number of exciting announcements and wins that show where the cloud is heading and what that means for our partners and customers. As we close out the year and finalize our plans for 2023, I wanted to provide a perspective on the most important partner developments from the event to help you hit the ground running next year.

Google Cloud's transformation cloud was front and center throughout our entire event. This powerful technology platform is designed to accelerate digital transformation for any organization by bringing five business-critical capabilities to our shared customers: the ability to build open data clouds to derive insights and intelligence from data; open infrastructure that enables customers to run applications and store data where it makes the most sense; a culture of collaboration, built on Google Workspace, that brings people together to connect and create from anywhere, enabling teams to achieve more; the same trusted environment that Google uses to secure systems, data, apps and customers from fraudulent activity, spam and abuse; and a foundational platform that uses efficient technology and innovation to drive cost savings and create a more sustainable future for everyone.

More than just vision, the transformation cloud is delivering results today. British fashion retailer Mulberry and partner Datatonic have built data clouds to drive a significant increase in online sales. Vodafone in EMEA is working with our partner Accenture to migrate and modernize its entire infrastructure. Hackensack Meridian Health in New Jersey is working with partner Citrix to leverage our infrastructure and Google Workspace to modernize its systems, enable collaboration, reduce costs, bolster security, and provide better patient and practitioner experiences. Many more transformation stories are available here and in our partner directory.

For our partners, the transformation cloud is your customer satisfaction engine. It enables you to bring new capabilities to market that customers cannot get anywhere else: from overcoming challenges around organizational management to demand forecasting, supply chain visibility and more, all of this is possible only with the capability of our data, AI/ML, collaboration and security tools. Thomas Kurian and Kevin Ichhpurani provided excellent insight and guidance for partners looking to begin or accelerate their journey with the transformation cloud in their Next partner keynote. Briefly, here are the three steps partners can take now to set themselves up for success in 2023.

First, customers expect you to be deeply specialized in cloud solutions and their business. Customers have made it clear they expect to work with partners who are deeply knowledgeable about the technology solutions and foundational elements of the transformation cloud. Just as important, it's no longer good enough for partners to offer a small group of highly trained individuals to do it all. Customers need deep cloud expertise within specific business functions, and even within global regions. They need people who know how to leverage our cloud solutions to achieve great outcomes for finance departments, human resources, customer service, operations and more. And more than that, customers need people who are also experts at driving these kinds of transformations within regional environments defined by unique policies, compliance requirements and even cultural issues. This is a tall order, but it's absolutely critical to your growth and success. This is why Google Cloud is investing in the tools, training and support you need to expand your bench of trained and certified individuals.

Second, increase your focus on consumption and service delivery to land and expand opportunities. The demand here is significant and growing. In its Global IT Market Outlook, analyst firm Canalys stated that partner-delivered IT products and services will account for a large share of the total global IT market this year and into next year, about even with its earlier forecast, which suggests that services remain in high demand. This includes managed services such as cloud infrastructure and software services, managed databases, managed data warehouses, managed analytic tools and more. These are high-margin endeavors for partners. Equally important, these kinds of services allow your customers to shift their people from managing technology to managing and growing the business. As Thomas Kurian said during his Next remarks, Google Cloud is not in the services business; that's the domain of our partners. We are a product and technology company. This is why we have a partner-led service delivery commitment and a goal of bringing partners into customer engagements across the board.

Third, we are investing to help Google Cloud partners drive consumption and new business. We know you are focused on growing your customer engagements and accelerating customers' time to value, and we're here to support you. Our Smart Analytics platform is a key market differentiator that enables partners to tap into the fast-growing data and analytics market. We are investing heavily in cybersecurity, and our recent acquisition of Mandiant extends our leadership in this area by combining offense and defense in powerful new ways. Governments worldwide are looking for sovereign cloud solutions to meet their security, privacy and digital sovereignty requirements; Google Cloud has a highly differentiated solution in this area, and partnerships are critical. We are driving to validate all of our ISV partner solutions through our Cloud Ready - Sovereign Solutions initiative. We are providing increasing resources and support to help partners embed the capabilities of Google Workspace in their solutions. We continue to allow customers to buy partner solutions and decrement their commits, just like with Google Cloud products. You'll see more from us on all of this at kick-off.

The opportunity to prosper, for Google partners and customers alike, is tremendous. I've never been more excited about the year ahead.

Footnote: IDC forecast on companies' spending on AI solutions.

Related Article: What's next for digital transformation in the cloud — Google Cloud Next is here. Check out the official kickoff blog and hear from our CEO Thomas Kurian on new customer wins, partnerships… Read Article |
2022-12-07 17:00:00 |
GCP |
Cloud Blog |
HEDIS on FHIR to improve quality of patient care |
https://cloud.google.com/blog/topics/healthcare-life-sciences/hedis-on-fhir/
|
HEDIS on FHIR to improve quality of patient care

If you're familiar with the Medicare/Medicaid market, you've probably heard of the HEDIS score: the Healthcare Effectiveness Data and Information Set. It's a standard created by the National Committee for Quality Assurance (NCQA), and it scores the quality of Medicare and Medicaid plans by measuring the effectiveness and utilization of plan benefits, access to care, and services delivered by providers. Not only does it drive reimbursement for providers and health plans, but it also impacts the attractiveness of insurance plans to prospective members while they enroll. To say that it's significant would be an understatement.

What does the HEDIS score mean for healthcare providers and payers? Increased physician collaboration and patient engagement to identify and close gaps in care. This puts tremendous focus on the need for data sharing and interoperability between the provider and the payer. Payers have a responsibility to help identify gaps in care and deliver those insights to providers so that patients can receive the highest quality care possible. The catch to all of this? Patients spend limited time in exam rooms and facilities in front of physicians. In those short engagements, how well and how quickly the payer can deliver insights and opportunities to physicians makes all the difference in raising quality of patient care and, subsequently, HEDIS scores. The ability to quickly and effectively exchange, analyze and summarize data between payer and provider is critical.

Using data to improve quality of care

For many of us, an ongoing relationship and history with our primary care physician can certainly help ensure more personalized care. When it comes to care gaps, however, it's not just a matter of knowing a patient's history. Care gaps are the result of comparing care and services rendered, the results of screenings and tests, and treatment recommendations based upon the patient's historical claims data and health records. During a typical visit, finding, comparing and summarizing all of this information quickly, and turning it into personalized care gap recommendations, can be challenging.

Fortunately, Google Cloud, Exafluence (EXF) and MongoDB have been working together to make identifying and closing care gaps far more efficient, effective and easier than ever before. Providers can enjoy a seamless user experience in SMART on FHIR, where clinical records turn into actionable care gaps and recommendations, helping them provide better care in less time with seamless and secure data exchange with payers. With the SMART on FHIR application on Google Cloud: providers can safely and securely retrieve a patient's clinical records from the Healthcare Data Engine (HDE) during patient office visits; providers can immediately request care gap recommendations from payers by sharing critical data and insights from patient health records; payers can receive care gap requests from providers and leverage an enriched data set, combining patient health records, plan benefits and treatment pathways, to identify care gaps in real time and send those back to the provider before the patient leaves the exam room; and providers can deliver personalized, insightful care to patients by incorporating care gap recommendations, helping both increase quality of care and raise HEDIS scores. Providers can interact in real time with payers to exchange information and receive personalized care gaps for patients and populations. Here is our approach to computing care gaps and making them available to clinicians at the point of care.

Identify data sources for computing the care gaps

When care gap recommendations are derived using historical claims data only, they are not as accurate and timely. In addition to the historical claims data, access to the patient's current clinical data is key to computing precise care gap recommendations. Providers and health systems use Google Cloud's Healthcare Data Engine (HDE) to aggregate clinical data from disparate healthcare systems and create longitudinal patient records. Data from HDE is accessible in a standards-compliant HL7 FHIR format through well-defined REST APIs. Many health plans have implemented MongoDB Atlas to store and manage claims, coverage and membership data. The EXF Care Gap service leverages the health plan data from the MongoDB Atlas database to compute care gap recommendations based on the HDE-hosted clinical data.

Define and implement rules to calculate care gaps

The EXF Care Gap service calculates care gap recommendations in near real time and stores them in the MongoDB Atlas database as FHIR resources. The service computes recommendations by applying configurable rules against the aggregated dataset stored in the MongoDB Atlas database and the clinical data sourced from HDE.

Serve care gaps through REST API and SMART on FHIR app

Clinical systems used at the point of care launch a SMART on FHIR-compliant app, which retrieves care gap recommendations from the EXF FHIR Server and displays them to clinicians. The app fetches the patient's current clinical information from HDE through Apigee HealthAPIx and submits it to the EXF FHIR Server to retrieve care gap recommendations from the Care Gap service in FHIR format.

How Google Cloud can help compute care gap recommendations

Apigee HealthAPIx and SMART on FHIR. Apigee HealthAPIx is an open source solution to accelerate interoperability in healthcare through standards like FHIR (Fast Healthcare Interoperability Resources) and SMART (Substitutable Medical Applications and Reusable Technologies). It runs on Apigee Edge with Google Cloud's HDE as a back end. The primary purpose of the HealthAPIx solution is to enable an API program in healthcare organizations and provide API-level access to healthcare data. The HealthAPIx solution comes with ready-made interoperability APIs to meet the Interoperability and Patient Access final rule (CMS-9115-F) published by the CMS. It enables payers, providers and technology vendors to meet the regulatory and compliance requirements defined under the 21st Century Cures Act, finalized by the Office of the National Coordinator for Health Information Technology (ONC), Department of Health and Human Services (HHS). With Apigee HealthAPIx, healthcare providers and health systems can manage, secure and scale APIs with an enterprise-grade platform that is FHIR-server agnostic; quickly close innovation gaps and keep up with digital advances; and easily ingest healthcare data from internal, external or open source FHIR-ready partners.

Google Cloud Healthcare Data Engine. HDE aggregates clinical data from disparate health systems to create an interoperable, longitudinal patient record. HDE can ingest and map data from HL7v2 messages and CSV- and FHIR-formatted flat files, and supports data ingestion in batches and in near real time. HDE harmonizes and reconciles the ingested data into FHIR R4 format, flattens it, and stores it in BigQuery. Patients' longitudinal health records in HDE are accessible through a REST API with FHIR-formatted payloads for transactional processing; the same data is also accessible from BigQuery through SQL-friendly interfaces for analytical processing. HDE incorporates a cloud configuration for healthcare that extends Google Cloud's security and privacy with support for HIPAA and HITRUST best practices, and leverages Google Cloud's highly scalable, secure, HIPAA-compliant managed services like the Cloud Healthcare API and BigQuery.

EXF FHIR with EXF proxy in Apigee X. When it comes to managing, securing and scaling your FHIR API, an enterprise-grade platform that is FHIR agnostic is more than recommended; it's essential. A FHIR server, in its most primitive form, handles inbound requests and processes, retrieves and exchanges data over REST-based APIs, using standardized FHIR resources, to and from underlying data sources. FHIR servers do the lowest-level data processing for data exchange and data format standardization, and keep us compliant with FHIR mandates. When it comes to delivering your FHIR APIs to the outside world, security, scale and performance are mandatory. An API gateway is essential, and it sits between your backend FHIR API and the external clients that interact with it. Google's Apigee X makes it easy for healthcare IT delivery teams to safely, securely and efficiently deploy and manage your FHIR API, helping you go to market faster with production-grade solutions and API-based digital services. The EXF FHIR server offers unparalleled flexibility, giving you the choice of pairing your FHIR APIs with new or existing MongoDB deployments. All flavors of MongoDB are supported, from Community Server and Enterprise Advanced to MongoDB Atlas. The flexibility to deploy in the cloud, on-prem, or in hybrid mode also provides a configuration for whatever requirements your organization must meet. By leveraging new or existing MongoDB deployments, the EXF FHIR server helps you reduce complexity and more quickly integrate with existing organization datasets and sources.

EXF Care Gap Service. Combining, enriching and aggregating data from the provider's FHIR resources along with existing plan benefit and treatment data, the EXF Care Gap service can deliver actionable care gap insights in real time. It can also map and transform existing data into FHIR-native format, helping healthcare organizations bring modern, FHIR-based applications to market from existing legacy sources. The Exafluence and MongoDB teams can work with healthcare organizations to define, plan and execute the implementation of the Care Gap service.

MongoDB for FHIR. MongoDB is a document database that implements the JSON-based document model, and as such is a natural fit to store all of your FHIR resources. Your FHIR resources can be stored in native format within MongoDB, with no modification needed to the schema definitions. This helps accelerate application delivery, simplifies your overall data architecture strategy, and allows FHIR to serve not only as the data model supporting your APIs but also as the new canonical model that your organization can migrate to. The extensibility and flexibility of the document model also allow you to extend your FHIR resources, letting you leverage native FHIR and add the information needed to support your organization's unique requirements. Unlike many COTS FHIR solutions, leveraging MongoDB as your underlying datastore also allows you to reduce the number of databases in your tech stack. Typically, a FHIR server will ship with an underlying database and a relational schema that differs from the FHIR schema. These COTS solutions often don't allow you to change the underlying database platform and, even worse, do not give you access to the underlying database directly. At MongoDB, we believe healthcare organizations should be able to do more with FHIR collections than simply serving the external FHIR API. With direct access to the underlying database and the MongoDB document model, organizations can build a new operational data layer that helps them modernize to domain-driven design principles, serve FHIR API needs, and build any new applications or services from one common data layer. To learn more about using MongoDB with FHIR and as an operational data layer, click here to read more about implementing an operational data layer.

Building a more seamless, interoperable future

The approach of making care gaps accessible to clinicians at the point of care is an example of how health plans and providers can exchange important healthcare information in real time using interoperability standards like FHIR and SMART on FHIR, regardless of the underlying technology used to implement the specifications. In this instance, the providers and health systems leverage Google's FHIR implementation in HDE and HealthAPIx, while the health plans leverage the FHIR implementation from EXF and MongoDB, to exchange key healthcare information in real time. The solution outlined in this blog post helps providers and health plans receive better scores and star ratings from the government, with significant Medicare financial incentives, and it enables providers to deliver personalized, contextual care for better patient outcomes. We look forward to partnering with healthcare organizations to build a standards-compliant, interoperable foundation for seamless, prompt and secure information exchange between providers and payers.

References: Building FHIR Applications with MongoDB Atlas; Learn more about FHIR on MongoDB; Using Apigee X with the Cloud Healthcare API (Cloud Architecture Center); GitHub: cqframework/cql-execution (a JavaScript framework for executing CQL); GitHub: cqframework/cql-exec-fhir (a FHIR data source for the JavaScript CQL Execution project); Care Gaps — AmeriHealth Caritas Delaware; HEDIS Medicare Health Outcomes Survey (NCQA); Introducing NCQA on FHIR; Beneficiary Claims Data API; GitHub: google/medical-claims-tools (examples, libraries and tools for working with bulk FHIR data, particularly FHIR BCDA claims at the moment); Medicare Advantage and Part D Star Ratings |
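The configurable, rule-driven care-gap computation described above can be sketched in a few lines of Python. This is a minimal illustration, not the EXF implementation: the measure (an annual HbA1c check for diabetic members), the field names, and the data shapes are all assumptions made for the example.

```python
from datetime import date, timedelta

def find_care_gaps(member, observations, today):
    """Return care-gap recommendations for one member.

    Illustrative HEDIS-style rule (hypothetical): a member coded as
    diabetic should have an HbA1c result recorded in the last 12 months.
    """
    gaps = []
    if "diabetes" in member.get("conditions", []):
        # Keep only HbA1c observations from the last 365 days.
        recent_a1c = [
            o for o in observations
            if o["code"] == "hba1c" and (today - o["date"]) <= timedelta(days=365)
        ]
        if not recent_a1c:
            gaps.append({
                "member": member["id"],
                "measure": "hba1c-annual",
                "recommendation": "Order HbA1c test: none recorded in last 12 months",
            })
    return gaps

member = {"id": "m-001", "conditions": ["diabetes"]}
stale = [{"code": "hba1c", "date": date(2021, 1, 15)}]  # over a year old
print(find_care_gaps(member, stale, today=date(2022, 12, 7)))  # one overdue-HbA1c gap
```

A real implementation would evaluate many such rules (often expressed in CQL, per the cql-execution references above) against FHIR resources rather than plain dicts, but the shape — configurable predicate in, FHIR-encoded recommendation out — is the same.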
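The "store FHIR resources in native format" point also lends itself to a small sketch. A FHIR resource is plain JSON, so a document store can hold it unchanged; below, an in-memory dict stands in for a MongoDB collection (with pymongo the insert would be roughly `db["Patient"].insert_one(patient)`), and the Patient fields follow the public FHIR Patient example.

```python
# A FHIR Patient resource is already JSON: no relational schema mapping needed.
patient = {
    "resourceType": "Patient",
    "id": "example",
    "name": [{"family": "Chalmers", "given": ["Peter", "James"]}],
    "birthDate": "1974-12-25",
}

collection = {}  # stand-in for a MongoDB collection

def insert_resource(store, resource):
    # Key by (resourceType, id), mirroring FHIR's logical resource identity.
    store[(resource["resourceType"], resource["id"])] = resource

insert_resource(collection, patient)
print(collection[("Patient", "example")]["name"][0]["family"])  # prints "Chalmers"
```

The design point the blog makes is that because the stored document *is* the FHIR resource, the same collection can back both the external FHIR API and internal analytics or operational-data-layer queries.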
2022-12-07 17:00:00 |
GCP |
Cloud Blog |
The business value of Cloud SQL: how companies speed up deployments, lower costs and boost agility |
https://cloud.google.com/blog/products/databases/the-business-value-of-cloud-sql/
|
The business value of Cloud SQL: how companies speed up deployments, lower costs and boost agility

If you're self-managing relational databases such as MySQL, PostgreSQL or SQL Server, you may be thinking about the pros and cons of cloud-based database services. Regardless of whether you're running your databases on premises or in the cloud, self-managed databases can be inefficient and expensive, requiring significant effort around patching, hardware maintenance, backups and tuning. Are managed database services a better option?

To answer this question, Google Cloud sponsored a business value white paper by IDC, based on the real-life experiences of eight Cloud SQL customers. Cloud SQL is an easy-to-use, fully managed database service for running MySQL, PostgreSQL and SQL Server workloads; many of the top Google Cloud customers use Cloud SQL. The study found that migration to Cloud SQL unlocked significant efficiencies and cost reductions for these customers. Let's take a look at the key benefits in this infographic.

Infographic: IDC business value study highlights the business benefits of migrating to Cloud SQL.

A deeper dive into Cloud SQL benefits

To read the full IDC white paper, you can download it here: "The Business Value of Cloud SQL: Google Cloud's Relational Database Service for MySQL, PostgreSQL and SQL Server," by Carl W. Olofson, Research Vice President, Data Management Software, IDC, and Matthew Marden, Research Vice President, Business Value Strategy Practice, IDC. Looking for commentary from IDC? Listen to the on-demand webinar "How Enterprises Have Achieved Greater Efficiency and Improved Business Performance using Google Cloud SQL," where Carl Olofson discusses the downsides of self-managed databases and the benefits of managed services like Cloud SQL, including the cost savings and improved business performance realized by the customers interviewed in the study.

Getting started

You can use our Database Migration Service for an easy, secure migration to Cloud SQL. Since Cloud SQL supports the same database versions, extensions and configuration flags as your existing MySQL, PostgreSQL and SQL Server instances, a simple lift-and-shift migration is usually all you need. So let Google Cloud take routine database administration tasks off your hands, and enjoy the scalability, reliability and openness that the cloud has to offer. Start your journey with a Cloud SQL free trial.

Related Article: What's new in Google Cloud databases: More unified. More open. More intelligent. Google Cloud databases deliver an integrated experience, support legacy migrations, leverage AI and ML, and provide developers world-class… Read Article |
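The lift-and-shift point can be illustrated with a toy sketch: because Cloud SQL runs the same engine versions and flags, application code typically stays the same and only the connection endpoint changes. The hostnames, user and database names below are hypothetical, and real deployments would use a secrets manager and, often, the Cloud SQL connector rather than a raw address.

```python
# Sketch: the same DSN-building code serves both the self-managed database
# and Cloud SQL; only the endpoint value differs.
def make_dsn(user, host, db, port=3306):
    """Build a MySQL connection string (exact syntax varies by client driver)."""
    return f"mysql://{user}@{host}:{port}/{db}"

before = make_dsn("app_user", "db01.internal.example.com", "orders")  # on-prem (hypothetical)
after = make_dsn("app_user", "10.24.0.5", "orders")  # Cloud SQL private IP (hypothetical)
print(before)
print(after)
```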
2022-12-07 17:00:00 |