Posted 2021-08-03 03:27:06: RSS feed digest for 2021-08-03 03:00 (29 items)

Category / Site / Article title or trending keyword / Link URL / Frequent keywords, summary, or search volume / Date registered
IT 気になる、記になる… Microsoft releases the July 2021 firmware update for the original Surface Book https://taisy0.com/2021/08/03/143728.html microsoft 2021-08-02 17:08:29
IT 気になる、記になる… Microsoft releases the July 2021 firmware update for the Surface Pro 4 https://taisy0.com/2021/08/03/143724.html microsoft 2021-08-02 17:03:08
AWS AWS Partner Network (APN) Blog How to Modernize a Replatformed Mainframe Development Lifecycle with AWS and NTT DATA https://aws.amazon.com/blogs/apn/how-to-modernize-a-replatformed-mainframe-development-lifecycle-with-aws-and-ntt-data/ How to Modernize a Replatformed Mainframe Development Lifecycle with AWS and NTT DATA: To increase business agility and keep pace with the rapid changes in the industry, the development lifecycle for mainframe applications should be accelerated. This post discusses how organizations that replatform their legacy mainframe applications to AWS using the NTT DATA UniKix product suite can implement a modern DevOps workflow with NTT DATA tools and select AWS services. The CI/CD pipeline explained in this post will help you speed up the development, testing, and release processes. 2021-08-02 17:41:23
AWS AWS Meet Kelly, Senior Enterprise Business Manager at AWS Malaysia | Amazon Web Services https://www.youtube.com/watch?v=-Iaw24ugOXE Meet Kelly, Senior Enterprise Business Manager at AWS Malaysia | Amazon Web Services: Here at AWS, our teams provide customers with expertise supported by comprehensive cloud capabilities. This is the best place to build and learn. Meet Kelly, Senior Enterprise Business Manager at AWS Malaysia, as she shares more. Come build the future of tech with AWS Malaysia; view open roles at AWS. Subscribe for more AWS videos and AWS events videos. About AWS: Amazon Web Services (AWS) is the world's most comprehensive and broadly adopted cloud platform, offering fully featured services from data centers globally. Millions of customers, including the fastest-growing startups, largest enterprises, and leading government agencies, are using AWS to lower costs, become more agile, and innovate faster. #AWS #AmazonWebServices #CloudComputing #AWSCareers 2021-08-02 17:28:24
python New posts tagged Python - Qiita [wave.py] Quickly generate audio in Python and write it out as a WAV file https://qiita.com/inxisiv/items/11a4958d567f404f8909 Quickly generate audio in Python and write it to a WAV file. What you need: Python's built-in wave module (wave.py). 2021-08-03 02:38:22
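The post above only names Python's built-in wave module; a rough sketch of the quick generate-and-write-WAV workflow it describes could look like the following (the tone frequency, duration, and file name are assumed for illustration, not taken from the post):

```python
import math
import struct
import wave

# Assumed parameters for illustration; the original post's values are not given.
SAMPLE_RATE = 44100   # samples per second
FREQUENCY = 440.0     # Hz (an A4 tone)
DURATION = 1.0        # seconds

# Generate one second of a 16-bit mono sine wave.
n_samples = int(SAMPLE_RATE * DURATION)
samples = [
    int(32767 * math.sin(2 * math.pi * FREQUENCY * i / SAMPLE_RATE))
    for i in range(n_samples)
]

# Write the samples out as a WAV file using the built-in wave module.
with wave.open("tone.wav", "wb") as wav_file:
    wav_file.setnchannels(1)        # mono
    wav_file.setsampwidth(2)        # 2 bytes per sample = 16-bit audio
    wav_file.setframerate(SAMPLE_RATE)
    wav_file.writeframes(struct.pack("<" + "h" * n_samples, *samples))
```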
Program New questions (all tags) | teratail PHP: extracting elements from an associative array with foreach https://teratail.com/questions/352477?rss=all About extracting the elements of an associative array with PHP foreach. I am a PHP beginner. 2021-08-03 02:21:53
Program New questions (all tags) | teratail How to launch another Python file from Python https://teratail.com/questions/352476?rss=all discord 2021-08-03 02:18:11
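The question above asks how to launch one Python file from another (apparently in a Discord-bot context); a common approach is the standard-library subprocess module. A minimal sketch, with a hypothetical script name:

```python
import subprocess
import sys

# Run another Python script with the same interpreter and wait for it to finish.
# "other_script.py" and its argument are placeholders, not taken from the question.
result = subprocess.run(
    [sys.executable, "other_script.py", "--some-arg", "value"],
    capture_output=True,
    text=True,
)

print("exit code:", result.returncode)
print("stdout:", result.stdout)
```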
Program New questions (all tags) | teratail [Unity] I want the player to jump when a UI button is clicked https://teratail.com/questions/352475?rss=all I want to make the player jump (only a set number of times) when the UI Button is pressed. 2021-08-03 02:11:40
Docker New posts tagged docker - Qiita Docker on macOS is slow, even with the cached or delegated volume options https://qiita.com/amaike/items/d6210767d13ab4368f75 Still slow even with cached or delegated. 2021-08-03 02:21:12
海外TECH Ars Technica SpaceX installed 29 Raptor engines on a Super Heavy rocket last night https://arstechnica.com/?p=1784505 regulatory 2021-08-02 17:23:16
海外TECH DEV Community Binary Search - JavaScript | Plus Big O Performance Explained Simply https://dev.to/doabledanny/binary-search-javascript-plus-big-o-performance-explained-simply-3jbn Binary Search JavaScript Plus Big O Performance Explained SimplyThis article was originally published on DoableDanny comThe Binary Search algorithm is used to search for any element in a sorted array If the element is found it returns the element s index If not it returns Binary Search is a simple and intuitive algorithm that also has great performance it can find the value fast much faster than the Linear Search algorithm as used by the built in method indexOf when the array is large Binary Search StepsBinary search is a “divide and conquer type algorithm meaning it divides the array roughly in half every time that it checks whether a value is the one we are looking for Why It means that upon every check the data the algorithm has to work with gets halved upon each comparison allowing the value to be found much faster See the gif below demonstrating Binary and Linear Search The target is the value With Linear Search if the target is at the end of the array every single element has to be looped over and checked if it equals the target With Binary Search we Start in the middle and check if the target is greater or less than that middle value If the target is greater than the middle value we will next look at the second half of the array ignore the left side If the target is smaller we look at the first half of the array ignore the right side We pick the middle of that half and check if it s greater or less than our target Repeat this process until we find our target Example with target Start at middle even length array so middle Is greater than smaller than or equal to Greater so must be in the right half of the array Pick new middle Is greater than smaller than or equal to Equal Return the index of that element So with Binary Search the data set keeps getting divided in half until we find our target This tremendously decreases time complexity Binary Search in JavaScriptNow we understand the logic of Binary Search let s implement it in JavaScript function binarySearch arr target let start let end arr length while start lt end let middle Math floor start end if arr middle lt target Search the right half start middle else if arr middle gt target Search the left half end middle else if arr middle target Found target return middle Target not found return console log binarySearch console log binarySearch console log binarySearch console log binarySearch Here s what s going on The function accepts a sorted array and a target value Create a left pointer at the first element of the array and a right pointer at the last element of the array While the left pointer comes before the right pointer Create a pointer in the middle If the target is greater than the middle element move the left pointer up If the target is less than the middle element move the right pointer down If the target equals the middle return the index If the value isn t found return On each iteration of the while loop we are effectively discarding half of the array until we find our value or until we ve exhausted the array Binary Search time complexityWe will now discuss the performance of Binary Search in terms of Big O Notation If you re unfamiliar with Big O I strongly suggest that you check out this article I wrote Big O Notation in JavaScript The Ultimate Beginners Guide with Examples It s a very important topic and will undoubtedly make you a 
better developer Best case complexity of Binary SearchThe best case complexity of Binary Search occurs when the first comparison is correct the target value is in the middle of the input array This means that regardless of the size of the array we ll always get the result in constant time Therefore the best case time complexity is O constant time Worst case complexity of Binary SearchThe worst case complexity of Binary Search occurs when the target value is at the beginning or end of the array See the image below if we have an array elements long and our target is then the array will be divided five times until we find So the Big O complexity of binary search is O log n logarithmic time complexity log Average case complexity of Binary SearchThe average case is also of O log n Space complexity of Binary SearchBinary Search requires three pointers to elements start middle and end regardless of the size of the array Therefore the space complexity of Binary Search is O constant space Performance summary table Linear Search vs Binary SearchBinary Search has much better time complexity than Linear Search which has a Big O n linear time From the graph of Big O Notation below we can see that with larger input arrays Binary Search yellow line will take a lot less time to compute than Linear Search blue line It should be noted that Binary Search only works on sorted arrays The sorting step if using an efficient algorithm will have a time complexity of O nlog n Since Linear Search can work on sorted arrays if the array is small or if we need to search the array just once then Linear Search might be a better choice Binary Search is a great choice if we have to make multiple searches on large arrays For example if we have a large element array Linear Search would require comparisons at worst case Binary Search would require log comparisons That s a lot less If you Want to Master Algorithms If you want to further your knowledge of algorithms and data structures check out JavaScript Algorithms and Data Structures Masterclass by Colt Steele It s the best Udemy course I ve ever taken If you enjoyed this article you can say thanks by subscribing to my YouTube channel Also feel free to connect with me on Twitter Thanks for reading 2021-08-02 17:31:02
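The summary above describes the iterative binary search (start and end pointers, a middle comparison, and -1 when the target is absent). A minimal Python sketch of that idea, not a reconstruction of the author's JavaScript listing, might look like this:

```python
def binary_search(arr, target):
    """Return the index of target in the sorted list arr, or -1 if absent."""
    start, end = 0, len(arr) - 1
    while start <= end:
        middle = (start + end) // 2
        if arr[middle] < target:      # target must be in the right half
            start = middle + 1
        elif arr[middle] > target:    # target must be in the left half
            end = middle - 1
        else:                         # found it
            return middle
    return -1                         # search space exhausted

# Worst case is O(log n): the search space halves on every comparison.
print(binary_search([1, 3, 5, 7, 9, 11], 9))   # -> 4
print(binary_search([1, 3, 5, 7, 9, 11], 4))   # -> -1
```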
海外TECH DEV Community Flutter counter app, but using isolates https://dev.to/hrishiksh/flutter-counter-app-but-using-isolates-4e5j Flutter counter app but using isolatesAsynchronous and parallelism is two different things Let s take an example Our brain can do only one thing at a time It can t handle two different things at the same instant If you want to talk about multitasking here it is actually asynchronous in nature In multitasking if a job is taking more time then we move to another work and when the first job is done we come back to process the output from the first job It is simple right Till now it looks like asynchronous processing is the bast way to boost our productivity right Yeah but not right Nowadays multicore CPU are default in our machines it can do a lot of different things parallelly So to use this architecture we have to make our software in a way to use multicore or multiple thread But the thing is that flutter use only one thread and it does all of its work on a single thread In this thread it has to pump Frames Per Second to give a fluid experience to the end user If we do some heavy work or long running task in this thread Then flutter thread will be busy in the heavy work and as a result frame rates of our app will drop and the app looks sluggish or stutters So what to do How to do heavy lifting work in out flutter app There are many ways to this in flutter Like writing asynchronous code or using compute function But the best way I have found that to use an isolated If you haven t heard this name before don t worry I am here to help What is an Isolate in flutterIsolate is a container which is completely separate from flutter thread and don t share any memory with the app Isolate means a CPU thread which run in its own sandbox We can create an isolate from a flutter app and communicate with an isolate by passing messages to and fro with the isolate So enough of introduction let s see how to create and use isolates in flutter What we are going to makeHere we will use the basic flutter counter app and implement isolate to generate random numbers Just like the vanilla flutter counter app but with isolate Excited Let s dive in How to create an Isolate in flutterTo create Isolate in flutter we need to import dart isolate package We can start an isolate from our main thread with Isolate spawn method This is the entry point to our isolate and we have to pass a ReceivePort in Isolate spawn method to receive message from the isolate and listen the messages as a stream in the main thread Let us create a stateful widget named MyApp in flutter class MyApp extends StatefulWidget override MyAppState createState gt MyAppState class MyAppState extends State lt MyApp gt override Widget build BuildContext context return Scaffold appBar AppBar title Text Counter body Center child Text floatingActionButton FloatingActionButton onPressed child Icon Icons add Now I define the Isolate isolate variable in the top of the stateful widget As we need to listen the incoming values from the isolate inside the build function as a stream Then in the initState function in our widget we start our isolate and pass a ReceivePort overridevoid initState spawnIsolate super initState Future spawnIsolate async ReceivePort receivePort ReceivePort isolate await Isolate spawn remoteIsolate receivePort sendPort debugName remoteIsolate In the above code remoteIsolate is the entry point of the isolate We have defined a ReceivePort receivePort and send the receivePort sendPort to the created isolate This sendPort 
is the address of the main thread of the application We have given our isolate a name called remoteIsolate This will be helpful to debug our code later This remoteIsolate can use sendPort to send data to the main thread Now inside the remoteIsolate function which is the entry point of our isolate we define the working or functionality of our isolate static void remoteIsolate SendPort sendPort sendPort send Hi i am from remote Isolate We can listen to this message in our main thread using the ReceivePort receivePort listen message print message Till now I think you have understood the basic working of an Isolate and how to use ReceivePort and SendPort to communicate between isolate and the main thread Bidirectional communication between isolate and the main threadBidirectional communication between isolate and the main thread is very useful if we have to use the same isolate repeatedly To send a message from the main thread to our isolate we need the sendPort of the isolate To get that sendPort we the isolate is initialize for the first time we create a ReceivePort inside the isolate and send the sendPort to the main thread as a message Then we can use this sendPort of the isolate to send a message from our main thread Let s modify our remoteIsolate function from above static void remoteIsolate SendPort sendPort ReceivePort isolateReceivePort ReceivePort sendPort send isolateReceivePort sendPort Now we have to identify the sendPort from the rest of the message in our main thread receivePort listen message if message is SendPort message send SendPort from Isolate received Now we can use this sendPort to send any message to the isolate So we can now use bidirectional messaging in flutter isolate and can reuse the isolate over and over again Flutter counter app using IsolateAs I have promised before in this article This is the complete source code of isolate implementation in the flutter counter app import dart isolate import dart math import package flutter material dart void main runApp MaterialApp home MyApp class MyApp extends StatefulWidget override MyAppState createState gt MyAppState class MyAppState extends State lt MyApp gt ReceivePort receivePort Isolate isolate SendPort isolateSendPort static void remoteIsolate SendPort sendPort ReceivePort isolateReceivePort ReceivePort sendPort send isolateReceivePort sendPort isolateReceivePort listen message if message sendPort send Random nextInt Future spawnIsolate async receivePort ReceivePort isolate await Isolate spawn remoteIsolate receivePort sendPort debugName remoteIsolate override void initState spawnIsolate super initState override void dispose if isolate null isolate kill super dispose override Widget build BuildContext context return Scaffold appBar AppBar title Text Counter body Center child StreamBuilder stream receivePort initialData NoData builder BuildContext context AsyncSnapshot snapshot if snapshot data is SendPort isolateSendPort snapshot data return Container child Text snapshot data toString floatingActionButton FloatingActionButton onPressed isolateSendPort send child Icon Icons add If you have like this article leave a comment If you want to use a file picker in flutter you can read this article Follow me on Twitter to discuss cool new tech 2021-08-02 17:28:00
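The article's example code is Dart; to keep a single example language in this digest, the sketch below shows a loosely analogous spawn-a-worker-and-exchange-messages pattern using Python's multiprocessing, not the Flutter Isolate API itself:

```python
import random
from multiprocessing import Process, Queue

def remote_worker(to_main: Queue, to_worker: Queue) -> None:
    """Runs in a separate process, roughly playing the role of the spawned isolate."""
    while True:
        message = to_worker.get()            # wait for a request from the main process
        if message is None:                  # sentinel value: shut the worker down
            break
        to_main.put(random.randint(0, 100))  # reply with a random number

if __name__ == "__main__":
    to_main, to_worker = Queue(), Queue()
    worker = Process(target=remote_worker, args=(to_main, to_worker))
    worker.start()

    # Each "button press" sends a message and waits for the worker's reply,
    # mirroring the bidirectional SendPort/ReceivePort exchange described above.
    for _ in range(3):
        to_worker.put("generate")
        print("counter value from worker:", to_main.get())

    to_worker.put(None)   # tell the worker to exit
    worker.join()
```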
海外TECH DEV Community Localize Your Content And Succeed Globally https://dev.to/strapi/localize-your-content-and-succeed-globally-496o Localize Your Content And Succeed GloballyEach global region has its customs jokes and different expectations of how content should be presented Launching your product into another region requires adequate and precise information about the region you are targeting Creating content that can appeal to customers in their own culture is imperative as it leads to the success of your product in a different market Appreciate the cultural difference and localize your content for different regions that your product is present in What pleases a user in Johannesburg might not impress a user in Beijing Giant companies like McDonald s and Nike win more customers through their websites by meticulously localizing their website content We will cover the following points in this article Importance of localizing contentConsequences of not localizing your contentFactors of localizationBest practices of localizationIn conclusionAnd we will dive deep into factors and localization best practices that drive your website to localization success Importance of localizing contentLocalization maximizes your audience globally There are more than billion people who are using the internet daily Localizing your content to many regions allows you to win more customers Localization broadens product variety breaks cultural and language barriers It leads to an increase in sales and product success in different locations Consequences of not localizing your contentOnce your website is available in any country and the content is not localized Cultural differences and language become barriers Potential customers are lost due to this Users prefer websites and mobile applications that are thoroughly localized to their culture In the end your competitor and rivals gain more strength to win more users Once your competitor has built a strong relationship with users it gets harder to win customers in that region Preference to Native Language Statistics shows that of internet users prefer to visit websites in their native language Another stat shows that of online shoppers prefer the native language when available on a website And of online shoppers will surely purchase if the website is localized crystalhues Factors to consider when localizing content SEO strategyAmericans wear pants and play soccer while the British wear trousers and play football As you shift your content to focus on various global regions SEO strategies need to be revised and changed Using the exact keywords in different locations could lead to failure Use local keywords phrases and backlinks What is popular in one location won t be popular in another location As you localize your website it is imperative to revise your SEO strategy specifically for that location Many Chinese residents use Baidu as google and other American sites were blocked On the other hand Russians prefer the Yandex search engine Suppose you re targeting any region that doesn t use Google as its primary search engine You must prepare a list of keywords to enhance the website for those search engines This list will go far in helping you to optimize your website and search engine results page SERP You will also need to know your potential customer and their needs when searching for your website in that region Cultural sensitivityColors have different meanings across the globe In western culture the color white symbolizes purity and innocence primarily used in 
weddings While in eastern culture white symbolizes sadness and it is often linked to death It is essential to question your choices on content if it is relevant to the cultural status quo Talk to language experts or natives of the location to know more about a specific region Currency numeric formats and measurementsNothing frustrates me as using a website that does not use my currency Suddenly I have to start checking currency exchange rates online Whenever selling goods and providing services online use a specific currency for every particular country And finally review tax rates and laws Present any necessary information to the user regarding tax rates or the law People are willing to buy the key point when they don t have to do currency conversions and calculations to know your online product costs Numeric formats differ across global regions Optimize your website so that the number is suitable for the specific region and finally consider measurements In America they use Miles and other countries use Kilometers km A customer familiar with using Kilometres might underestimate the speed of m h while comparing it to a car with a speed of Km h Customer care serviceProduct documentation chatbots and customer call center have to play a role also Suppose you are going to localize your website Customers need the assurance to assist them in a language that they can understand if they experience trouble with your product Best practices of localization Keep track of localization resultsMonitor and track localization results effectively to change the content strategy whenever necessary Your implemented strategy might not perform well as you had expected So it is better to use language management systems to track performance rather than waiting for the user s feedback or change in sales reports to know the outcome of localizing your content Also the content strategy should clearly state a backup plan Adequate planningThe first step to localization is knowing which content should be localized What languages will be translated Create a plan and decision to translate specific languages Set a demographic target to avoid unnecessary work Above all localization should also be involved in the website design process at the beginning of the website development stage While planning ask yourself questions such as Am I taking the right approach How can I plan my SEO strategy What is my target market Did I network with the right people who know more about the targeted market Test your software and proofreadThe final content you have been working on has to be reviewed by professional translators and software UI developers before rolling out the product to the public The complete look of the localized content matters the most Does each piece of the content resonate well with the culture Localization testing involves Testing localization aspects such as content transition Linguistics Checking typography and verifying the cultural appropriateness of the user interface Externalize all translatable contentDo not hardcode strings It should translate strings and content to be stored in resource files In the future you won t have to break the strings again if you do this This will allow translation to be easily carried out and also make the translation procedure highly effective This will also improve teamwork as translators work on the resource file containing the resource files and the software developer focuses on internalization Using Unicode UTF encoding of strings and text expansionUse the UTF encoding when 
encoding strings This encoding supports all languages Leave plenty of space for text expansion when translating languages Translating English to Spanish adds more words and this affects the user interface Text contraction might also occur if you are translating English that has lower text length Keep this in mind ConclusionIf you want to succeed internationally you have to build websites and mobile applications that do not discriminate Appreciate cultural differences and implement localization best practices 2021-08-02 17:12:27
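To make the "externalize translatable strings" and UTF-8 advice concrete, here is a small illustrative sketch that loads UI strings from per-locale UTF-8 resource files instead of hardcoding them; the file layout and keys are invented for illustration:

```python
import json
from pathlib import Path

# Hypothetical layout: one UTF-8 JSON resource file per locale, e.g.
#   locales/en.json -> {"checkout.title": "Checkout", "checkout.total": "Total: {amount}"}
#   locales/es.json -> {"checkout.title": "Pagar",    "checkout.total": "Total: {amount}"}
def load_strings(locale: str) -> dict:
    path = Path("locales") / f"{locale}.json"
    with open(path, encoding="utf-8") as f:   # always read resource files as UTF-8
        return json.load(f)

def translate(strings: dict, key: str, **params) -> str:
    # Fall back to the key itself so a missing translation is visible, not a crash.
    return strings.get(key, key).format(**params)

strings = load_strings("es")
print(translate(strings, "checkout.total", amount="19,99 €"))
```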
Apple AppleInsider - Frontpage News Steve Jobs and Apple Auction includes autograph, bomber jacket, Apple-1 https://appleinsider.com/articles/21/08/02/steve-jobs-and-apple-auction-includes-autograph-bomber-jacket-apple-1?utm_medium=rss Steve Jobs and Apple Auction includes autograph, bomber jacket, Apple-1: Another Apple-centric auction is spinning up that boasts an Apple II manual signed by Steve Jobs and Mike Markkula, Jobs' famous leather jacket, and plenty of other Apple memorabilia. Among the items in the Boston, MA-based RR Auction listing is an Apple II manual signed and inscribed to the son of UK entrepreneur Michael Brewer, who negotiated exclusive distribution rights for Apple in the UK. The inscription on the manual reads: "Julian, Your generation is the first to grow up with computers. Go change the world. steven jobs." Read more 2021-08-02 17:54:07
Apple AppleInsider - Frontpage News Blackstone-backed firm to acquire Reese Witherspoon's Hello Sunshine https://appleinsider.com/articles/21/08/02/blackstone-backed-firm-to-acquire-reese-witherspoons-hello-sunshine?utm_medium=rss Blackstone-backed firm to acquire Reese Witherspoon's Hello Sunshine: After reportedly being considered for purchase by Apple, Reese Witherspoon's Hello Sunshine media company, which is behind a number of Apple TV+ originals, is selling itself to a Blackstone-backed firm. (Credit: Hello Sunshine) Although the terms of the deal haven't been disclosed, The Wall Street Journal reported Tuesday that it values Hello Sunshine at about million. The name of the media company that Blackstone is backing is also unknown, but it will be run by former Disney executives Kevin Mayer and Tom Staggs. Read more 2021-08-02 17:27:11
Apple AppleInsider - Frontpage News Google teases Pixel 6, Pixel 6 Pro with new Tensor AI chip https://appleinsider.com/articles/21/08/02/google-teases-pixel-6-pixel-6-pro-with-new-tensor-ai-chip?utm_medium=rss Google teases Pixel 6, Pixel 6 Pro with new Tensor AI chip: Google has teased its Pixel 6 and Pixel 6 Pro smartphones ahead of a launch later in the year, with the smartphones using a new custom-designed Tensor chip. (Pictured: Google's Pixel 6 and Pixel 6 Pro.) Posted to Twitter on Monday, the thread offered quite a few details about Google's upcoming smartphone launches, replacing the current-generation Pixel series. Rather than a complete device breakdown, the posts focused on the imaging capabilities as well as the processor. Read more 2021-08-02 17:15:14
Apple AppleInsider - Frontpage News How to use tags in Reminders iOS 15 https://appleinsider.com/articles/21/08/02/how-to-use-tags-in-reminders-ios-15?utm_medium=rss How to use tags in Reminders in iOS 15: Apple's Reminders has always been a solid basic To-Do app, but in iOS 15 it takes another step forward by introducing tags as an extra level of organization. You can now tag Reminders to help find tasks later. You have to imagine that there are entirely different teams working on Apple's range of apps, but if there are, they have a lot of lunches together. Among the latest additions in Apple's updated Reminders app is the same tagging system you can see in Quick Notes and Apple Notes. Read more 2021-08-02 17:04:20
海外TECH Engadget Sony adds 'Nier: Automata,' 'Ghostrunner' and 'Undertale' to PlayStation Now https://www.engadget.com/sony-playstation-now-nier-automata-ghostrunner-undertale-august-3-173510738.html?src=rss Sony adds 'Nier: Automata,' 'Ghostrunner' and 'Undertale' to PlayStation Now: Sony is adding Nier: Automata, Ghostrunner and Undertale to its PlayStation Now service, the company announced on Monday. Subscribers can play all three games starting on August 3rd, with Nier: Automata only available on the service until November. While none of the titles Sony is adding tomorrow are exactly new, or for that matter exclusive to PlayStation Now, they're smart additions to the platform's library all the same. Nier and Undertale are particularly well regarded, and just niche enough that not everyone has gone out of their way to play them. As Sony looks for ways to counter Xbox Game Pass, PlayStation Now subscribers can look forward to more high-profile games making their way to the platform. Back in May, the company said it was working on "strengthening the service by investing in and partnering with external studios." 2021-08-02 17:35:10
海外TECH Engadget Bandcamp will keep waiving its fees one day a month through the end of 2021 https://www.engadget.com/bandcamp-fridays-2021-artists-labels-revenue-172054991.html?src=rss Bandcamp will keep waiving its fees one day a month through the end of 2021: Since March 2020, when the COVID pandemic really took hold in most of the world, Bandcamp has waived its commissions on the first Friday of each month. The Bandcamp Friday initiative is now set to continue through the end of the year; the next edition takes place in August. Many music fans have participated in Bandcamp Fridays, paying artists and labels millions in the process. On Bandcamp Fridays, artists and labels receive the bulk of sales revenue after payment processor fees; on every other day of the month, artists and labels still receive the majority of revenue from purchases, Bandcamp says. Although the world is slowly returning to a semblance of normality and musicians can once again play shows in some regions, the long-term financial impact of the pandemic is something artists and labels, particularly smaller ones, are still dealing with. Initiatives like Bandcamp Fridays could put some extra cash in their pockets to alleviate some of the strain, so it's good to see Bandcamp continuing the effort for another few months. 2021-08-02 17:20:54
海外TECH Engadget SkulptSynth SE review: Cheap and confusing, but incredibly powerful https://www.engadget.com/modal-electronics-skulptsynth-se-review-virtual-analog-170031324.html?src=rss SkulptSynth SE review Cheap and confusing but incredibly powerfulOver the last decade or so we ve seen an explosion of cheap portable synths driven largely by Korg s deceptively powerful Volca line Even companies like Modal Electronics which typically trafficked in higher end instruments that started at around suddenly felt pressure to compete in the sub range First it dabbled with two simplistic DIY kits the CraftSynth and CraftRhythms But then in it brought the Skulpt to Kickstarter followed shortly thereafter by the CraftSynth These were much more serious shots at the entry level But where both impressed with their sound the build quality was poor and the original Skulpt was a little overpriced at The SkulptSynth SE attempts to address those concerns It s quite a bit cheaper at just putting it more in line with the Volcas and the build quality is sturdier too But under the hood it s largely the same instrument and the question is whether it s as compelling now as it was three years ago Alright I won t make you wait The answer is yes The Skulpt sounds great Sure plenty of other synths have come and gone over the last three years especially at the cheaper end of the spectrum but that doesn t take away from the excellent virtual analog engine here There s a total of oscillators stacked in four voices though using the spread function you can expand that to with two different waves per voice This gives the Skulpt a thick tone that while you probably won t mistake it for true analog is still plenty inviting While it s perfectly capable of handling basslines and lead duty the Skulpt really shines when it comes to pads where those stacked oscillators really flesh out the sound The Skulpt has a wealth of modulation options too that give it a surprising amount of depth for something so small and cheap There are dedicated filter and amp envelopes but also a mod envelope and a pair of LFOs one of which is polyphonic Those last three can be assigned to a host of different destinations with a total of eight modulation slots available In addition there s ring modulation FM and pulse width modulation options plus a morphing filter that goes from lowpass to bandpass then highpass not to mention built in delay and distortion effects There are far more sound design tools here than you d have any right to expect for Terrence O Brien EngadgetThe wealth options are great but actually navigating those controls is a bit difficult The unit comes with a cheat sheet and you ll always want it on hand While the front plate has simpler less confusing graphics than the original Skulpt the SE still isn t exactly intuitive While there are lines that show you what controls are connected to each other they re not laid out in any sort of obviously logical way And the tiny orange and white text labels feel crowded and hard to read at times While the layout is visually interesting it s also infuriating This a place where form clearly won out over function and not for the better Another thing that s improved on the SE is the build quality But just like the panel design it s not as huge an upgrade as you might hope The knobs feel a little firmer and have slightly more resistance than on the Craft but they re still pretty wobbly and cheap At least they don t slide right off the encoders with a gentle tug though Engadget ·Modal Electronics 
SkulptSynth SE sound samplesI m not sure how much better the overall build is than the original Skulpt and it s only marginally better than the Craft In general the Skulpt SE still feels chintzy But it does come with a cover that will help protect it in a bag which is more than I can say for the Craft Sadly though the touch keyboard is as bad as ever It s not always super responsive and the layout somehow manages to feel both cramped and sprawling all at the same time Playing simple triads required an uncomfortable amount of stretching and I could rarely play a chord progression without accidentally triggering at least one stray note Volca keyboards are certainly nothing to get excited about but they make the Modal touchstrips feel like cheap imposters Terrence O Brien EngadgetOne final gripe about the physical design The Skulpt has full sized MIDI in and out ports but a mm lineout jack While full size MIDI DINs are appreciated I d rather see ¼ inch audio outs and smaller TRS MIDI jacks if I had to choose And honestly on something that s battery powered and portable TRS MIDI just makes more sense Save the space A lot of my complaints about the interface and unimpressive build quality can be ignored though if you just use the app Stick the Skulpt someplace out of the way and connect it to your computer or phone over USB and you re set Then you can control it via MIDI over USB and do all your patching from within the Modal app It s much easier than using the device itself It even has a VST version so you can control the Skulpt from your DAW but I had some issues getting it running on my Windows PC The Skulpt is usable without the app unlike the Craft But honestly this is probably the way you want to use the Skulpt anyway While the whole battery powered portable thing is nice what s more exciting is its support for MPE which is basically unheard of at this price point But the built in keys do not support MPE velocity or aftertouch you ll need to use an external controller Having access to aftertouch and polyphonic expression gives the Skulpt much more life As a digital synth trying to emulate analog it can sound a little cold at times but the small fluctuations that come with aftertouch and MPE make the synth feel more organic Just be warned that dialing in MPE controls requires a bit of trial and error definitely turn down the pitch bend range in the settings Terrence O Brien EngadgetThis also gives you more room to experiment with your sound design in interesting ways For example in one patch I set the mod wheel sliding your fingers along the Y axis on the Sensel Morph to change the wave shape and aftertouch to control the filter cutoff Then I could play a chord and slowly slide my fingers up on the highernotes causing them to fizzle out as the wave morphed from saw to noise but keep the bass note droning cleanly And then I could adjust the balance between these two sounds simply by pressing harder or softer on the right keys This isn t necessarily groundbreaking but it s definitely impressive on a synth that costs just This does make me dream of an upgraded CraftSynth though As much as I enjoy the Skulpt I prefer the sonic palette of the wavetable based Craft a bit more I d love to see Modal release a polyphonic version of it that supports MPE Basically give us the best of both synths in a single device There are so many options out there at the entry level for synths these days it s hard to say there is one that is best for most people The Skulpt SE certainly wouldn t be a bad choice 
though If you re truly just starting to learn synthesis something more straightforward like the Volca Keys might be a good option But it doesn t have the depth of the Skulpt If you re willing to spend a bit more Arturia s Microfreak has even more sound design options to get lost in Plus it regularly gets new features and sounds The only issue is it s even more complicated than the Skulpt though its controls are much easier to navigate The SkulptSynth SE shows that Modal is serious about playing at the entry level The company has delivered an excellent sounding instrument with a wealth of features at an impressive price If it ever figures out how to design an interface that doesn t make you want to rip your hair out Korg s grip on the budget synth market may be in jeopardy 2021-08-02 17:00:31
ニュース JETRO Business News (Tsusho Koho) Q2 trade: exports hit a record high https://www.jetro.go.jp/biznews/2021/08/a5ee4e0aa5089092.html record high 2021-08-02 17:30:00
ニュース JETRO Business News (Tsusho Koho) June unemployment improved by 0.2 points month-on-month in the EU and 0.3 points in the euro area https://www.jetro.go.jp/biznews/2021/08/76df9efac5cf1977.html unemployment rate 2021-08-02 17:30:00
ニュース JETRO Business News (Tsusho Koho) LG Electronics posts its highest-ever Q2 revenue https://www.jetro.go.jp/biznews/2021/08/12f38f0ed20fa004.html record 2021-08-02 17:20:00
ニュース JETRO Business News (Tsusho Koho) Total trade for January-June 2021 up roughly 20% https://www.jetro.go.jp/biznews/2021/08/1853013235b78d03.html trade 2021-08-02 17:10:00
ニュース BBC News - Home NHS Covid-19 app in England and Wales tweaked to notify fewer contacts https://www.bbc.co.uk/news/uk-58062180 asymptomatic 2021-08-02 17:41:42
ニュース BBC News - Home Covid: PM defends approach to international travel https://www.bbc.co.uk/news/uk-58059614 variants 2021-08-02 17:17:52
ニュース BBC News - Home Bridgend river death: Boy, 5, named as Logan Mwangi https://www.bbc.co.uk/news/uk-wales-58049509 williamson 2021-08-02 17:40:46
ニュース BBC News - Home I'm a Celebrity... Get Me Out of Here! to return to Wales in 2021 https://www.bbc.co.uk/news/uk-wales-58053077 november 2021-08-02 17:49:20
GCP Cloud Blog Image search with natural language queries https://cloud.google.com/blog/topics/developers-practitioners/image-search-natural-language-queries/ Image search with natural language queriesThis post shows how to build an image search utility using natural language queries Our aim is to use different GCP services to demonstrate this At the core of our project is OpenAI s CLIP model It makes use of two encoders one for images and one for texts Each encoder is trained to learn representations such that similar images and text embeddings are projected as close as possible We will first create a Flask based REST API capable of handling natural language queries and matching them against relevant images We will then demonstrate the use of the API through a Flutter based web and mobile application Figure shows how our final application would look like Figure Final application overview All the code shown in this post is available as a GitHub repository Let s dive in  Application at a high levelOur application will take two queries from the user Tag or keyword query This is needed in order to pull a set of images of interest from Pixabay You can use any other image repositories for this purpose But we found Pixabay s API to be easier to work with We will cache these images to optimize the user experience Suppose we wanted to find images that are similar to this query “horses amidst flowers For this we d first pull in a few “horse images and then run another utility to find out the images that best match our query   Longer or semantic query that we will use to retrieve the images from the pool created in the step above These images should be semantically similar to this query  Note Instead of two queries we could have only taken a single long query and run named entity extraction to determine the most likely important keywords to run the initial search with For this post we won t be using this approach  Figure below depicts the architecture design of our application and the technical stack used for each of the components Figure Architecture design and flow Figure also presents the core logic of the API we will develop in bits and pieces in this post We will deploy this API on a Kubernetes cluster using the Google Kubernetes Engine GKE The following presents a brief directory structure of our application code base Next we will walk through the code and other related components for building our image search API For various machine learning related utilities we will be using PyTorch  Building the backend API with FlaskFirst we d need to fetch a set of images with respect to user provided tags keywords before performing the natural language image search The utility below from the pixabay utils py script can do this for us Note that all the API utilities are logging relevant information But for brevity we have omitted the lines of code responsible for that Next we will see how to invoke the CLIP model and select the images that would best match a given query semantically  For this we ll be using Hugging Face an easy to use Python library offering state of the art NLP capabilities We ll collate all the logic related to this search inside a SimilarityUtil class CLIP MODEL uses a ViT base model to encode the images for generating meaningful embeddings with respect to the provided query The text based query is also encoded using A Transformers based model for generating the embeddings These two embeddings are matched with one another during inference To know more about the particular methods we 
are using for the CLIP model please refer to this documentation from Hugging Face  In the code above we are first invoking the CLIP model with images and the natural language query This gives us a vector logits per image that contains the similarity scores between each of the images and the query We then sort the vector in a descending manner Note that we are initializing the CLIP model while instantiating the SimilarityUtil to save us the model loading time This is the meat of our application and we have tackled it already If you want to interact with this utility in a live manner you can check out this Colab Notebook  Now we need to collate our utilities for fetching images from Pixabay and for performing the natural language image search inside a single script perform search py Following is the main class of that script Here we are just calling the utilities we had previously developed to return the URLs of the most similar images and their scores What is even more important here is the caching capability For that we combined GCP s MemoryStore and a Python library called direct redis More on setting up MemoryStore later  MemoryStore provides a fully managed and low cost platform for hosting Redis instances Redis databases are in memory and light weight making them an ideal candidate for caching In the code above we are caching the images fetched from Pixabay and their URLs So in the event of a cache hit we won t need to call the CLIP model and this will tremendously improve the response time of our API  Other options for cachingWe can cache other elements of our application For example the natural language query When searching through the cached entries to determine if it s a cache hit we can compare two queries for semantic similarity and return results accordingly  Consider that a user had entered the following natural language query “mountains with dark skies After performing the search we d cache the embeddings of this query Now consider that another user entered another query “mountains with gloomy ambiance We d compute its embeddings and run a similarity search with the cached embeddings We d then compare the similarity scores with respect to a threshold and parse the most similar queries and their corresponding results In case of a cache miss we d just call the image search utilities we developed above  When working on real time applications we often need to consider these different aspects and decide what enhances the user experience and maximizes business at the same time  All that s left now for the backend is our Flask application main py Here we are first parsing the query parameters from the request payload of our search API  We are then just calling the appropriate function from perform search py to handle the request This Flask application is also capable of handling CORS We do this via the flask cors library And this is it Our API is now ready for deployment  Deployment with Compute Engine and GKEThe reason why we wanted to deploy our API on Kubernetes is because of the flexibility Kubernetes offers for managing deployments When operating at scale auto scalability and load balancing are very important With the comes the requirement of security we d not want to expose the utilities for interacting with any internal services such as databases With Kubernetes we can achieve all these easily and efficiently  GKE provides secured and fully managed functionalities for operationalizing Kubernetes clusters Here are the steps to deploy the API on GKE at a glance We first build a 
Docker image for our API and then push it to the Google Container Registry GCR We then create a Kubernetes cluster on GKE and initialize a deployment We then add scalability options If any public exposure is needed for the API we then tackle it  We can assimilate all the above into a shell script ks deploy sh These steps are well explained in this tutorial that you might want to refer to for more details We can configure all the dependencies on our local machine and execute the shell script above We can also use the GCP Console to execute it since a terminal on the GCP Console is pre configured with the system level dependencies we d need In reality the Kubernetes cluster should only be created once and different deployment versions should be created under it  After the above shell script is run successfully we can run kubectl get service to know the external IP address of the service we just deployed We can now consume this API with the following base URI If we wanted to deal with only http based API requests then we are done here But secured communication is often a requirement in order for applications to operate reliably In the following section we are to discuss how to configure the additional items to allow our Kubernetes cluster to allow https requests   Configurations for handling https requests with GKEA secure connection is almost often a must have requirement in modern client server applications The front end Flutter application would be hosted on GitHub Pages for this project and it requires https based connection as well Even if configuring https connection particularly for a GKE based cluster can be considered a chore its setup might seem daunting at first There are six steps to configure https connection in the GKE environment  You need to have a domain name and there are a lot of inexpensive options that you can buy For instance mlgde com domain for this project is acquired via Gabia which is a Korean service provider A reserved static external IP address has to be acquired via gcloud command or GCP console  You need to bind the domain name with the acquired external IP address This is a platform specific configuration that issued the domain name to you  There is a special ManagedCertificate resource which is specific to the GKE environment ManagedCertificate resource specifies the domain that the SSL certificate will be created for so you need this  An Ingress resource should be created by listing the static external IP address ManagedCertificate resource and the service name and port which the incoming traffic will be routed to The Service resource could remain the same as in the above section with only changes from LoadBalancer to ClusterIP  Last but not least you need to modify the existing Flask application and Deployment resource to support liveness and readiness probes which are used to check the health status of the Deployment The Flask application side can be simply modified with the flask healthz Python package and you only need to add livenessProbe and readinessProbe sections in the Deployment resource In the code example below the livenessProbe and readinessProbe are checked via alive and ready endpoints respectively One thing to be careful of is the initialDelaySeconds attribute of the probes It is uncommon to configure this attribute with a big number but it could be bigger than seconds depending on the size of the model to be used For this project it is configured in seconds in order to wait until the CLIP model is fully loaded into memory full YAML script 
here Again these steps may seem daunting at first but it will become clear when you have done it once Here is the official document for Using Google managed SSL certificates You can find all the GKE related resources used in this project here  Once every step is completed you should be able to see your server application running on the GKE environment Please make sure to run kubectl apply command whenever you create Kubernetes resources such as Deployment Service Ingress and ManagedCertificate and it is important to wait for more than minutes until the ManagedCertifcate provisioning is done  You can run gcloud compute addresses list command to find out the static external IP address that you have configured Then the IP address has to be mapped to the domain Figure is a screenshot of a dashboard from where we got the mlgde com domain It clearly shows mlgde com is mapped to the static external IP address configured in GCP Figure API endpoints mapped to our custom domain In case you re wondering why we didn t deploy this application on App Engine well that is because of the compute needed to execute the CLIP model App Engine instance won t fit in that regime We could have also incorporated compute heavy capabilities via a VPC Connector That is a design choice that you and your team would need to consider In our experiments we found the GKE deployment to be easier and suitable for our needs  Infrastructure for the CLIP modelAs mentioned earlier at the core of our application is the CLIP model It is computationally a bit more expensive than the regular deep learning models This is why it makes sense to have the hardware infrastructure set up accordingly to execute it We ran a small benchmark in order to see how a GPU based environment could be beneficial here  We ran the CLIP on a Tesla P based machine and also on a standard CPU only machine times The code snippet below is the meat of what we executed As somewhat expected with the GPU the code took minutes to complete execution With no GPU it took about minutes It is uncommon to leverage GPUs for model prediction because of cost restrictions but sometimes we have to access GPUs for deploying a big model like CLIP We configured a GPU based cluster on GKE and compared the performance differences with and without it It took about second to handle a request with GPU and MemoryStore cache while it took more than seconds with MemoryStore only without the GPUs  For the purposes of this post we used a CPU based cluster on Kubernetes But It is easy to configure GPU usage in a GKE cluster This document shows you how to do so For a short summary there are two steps First a node should be configured with GPUs when creating a GKE cluster Second GPU drivers should be installed in GKE nodes You don t need to visit and manually install GPU drivers for each node by yourself Rather you can simply apply the DaemonSet resource to GKE as described here Setting up MemoryStoreIn this project we first query the general concept of images to Pixabay then we filter the images with a semantic query using CLIP It means we can cache the initially retrieved images from Pixabay for the next specific semantic query For instance you may want to search with “gentleman wearing tie at first then you may want to retry searching for “gentleman wearing glass In this case the base images remain all the same so they could be stored in a cache server like Redis  MemoryStore is a GCP service wrapping the Redis which is an in memory data store so you can simply use a standard Redis Python 
package for accessing it The only thing to be careful about when provisioning a MemoryStore Redis instance is to make sure it is in the same region where your GKE cluster or Compute Engine instance is Figure MemoryStore setup The code snippet below shows how to make a connection to the Redis instance in Python Nothing specific to GCP but you only need to be aware of the usage of the standard redis py package  After creating a connection you can store and retrieve data from MemoryStore There are more advanced use cases of Redis but we only used exists get and set methods for the demonstration purpose These methods should be very familiar if you know maps dictionaries or other similar data structures For the code portion that uses Redis related utilities please refer to the Searcher Python class we discussed in an earlier section  In the URLs below you can find side by side comparisons of using MemoryStore Without MemoryStore With MemoryStore st try With MemoryStore nd try  Putting everything togetherAll that s left now is to collate the different components we developed in the sections above and deploy our application with a frontend All the frontend related code is present here  The front end application is written in the Flutter development kit The main screen contains two text fields for queries to Pixabay and CLIP model respectively When you click the “Send Query button it will send out a RestAPI request to the server After receiving the result back from the server the retrieved images from the semantic query will be displayed at the bottom section of the screen  Please note that a Flutter application can be deployed to various environments including desktop web iOS and Android In order to keep as simple as possible we chose to deploy the application to the GitHub Pages Whenever there is any change to a client side source directory the GitHub Action will be triggered to build a web page and deploy the latest version to the GitHub Pages  Our final application is deployed here and it looks like so Figure Live application screen Note that due to constraints the above mentioned URL will only be live for one or two months   It is also possible to redeploy the back end application with a GitHub Action  The very first step is to craft a Dockerfile like below Since Python is a scripting language and there are lots of heavy packages that the application is dependent on it is important to cache the steps For instance installing the dependencies should be separated from other commands With the Dockerfile defined we can use a GitHub Action like this for automatic deployment  Edge casesSince the CLIP model is pre trained on a large corpus of image and text pairs it s likely that it may not generalize well to every natural language query we throw at it Also because we are limiting the number of images on which the CLIP model can operate this somehow restricts the expressivity of the model We may be able to improve the performance for the second situation by increasing the number of images to be pre fetched and by indexing them into a low cost and high performance database like Datastore  CostsIn this section we wanted to provide the readers a breakdown of the costs they might incur in order to consume the various services used throughout the application  Frontend hostingThe front end application is hosted on GitHub Pages so there is no expenditure for this Compute EngineWith an e standard instance type without GPUs the cost is around per month In case you want to add a GPU NVIDIA K the cost goes up 
to per month MemoryStoreThe cost for MemoryStore depends on the size With GB of space the cost is around per month and whenever you add more GBs the cost will be doubled Google Kubernetes EngineThe monthly cost for a node GKE cluster with n standard vCPUs RAM GB without GPUs is about If you add one GPU NVIDIA K to the cluster the cost goes up to While you may think that is a lot cost wise it is good to know that Google gives away free credits when you create a new GCP account It is still not enough for leveraging GPUs but it is enough to learn and experiment with GKE and MemoryStore usage ConclusionIn this post we walked through the components needed to build a basic image search utility for natural language queries We discussed how these different components are connected to each other Our image search API is able to utilize caching and was deployed on a Kubernetes cluster using GKE These elements are essential when building a similar service to cater to a much bigger workload We hope this post will serve as a good starting point for that purpose Below are some references on similar areas of work that you can explore Building a real time embeddings similarity matching systemDetecting image similarity using Spark LSH and TensorFlowAcknowledgments We are grateful to the Google Developers Experts program for supporting us with GCP credits Thanks to Karl Weinmeister and Soonson Kwon of Google for reviewing the initial draft of this post 2021-08-02 17:30:00
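The post's core step, scoring a pool of pre-fetched Pixabay images against a natural-language query with OpenAI's CLIP via Hugging Face, might be sketched as follows; the checkpoint name and example inputs are assumptions rather than code copied from the project's repository:

```python
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

# Load the CLIP checkpoint once at startup, as the post describes.
model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

# Hypothetical inputs: images already fetched from Pixabay for the keyword query.
image_paths = ["horse_1.jpg", "horse_2.jpg", "horse_3.jpg"]
images = [Image.open(p) for p in image_paths]
query = "horses amidst flowers"

# Encode the query and the candidate images together.
inputs = processor(text=[query], images=images, return_tensors="pt", padding=True)
with torch.no_grad():
    outputs = model(**inputs)

# logits_per_image holds one similarity score per (image, query) pair;
# sorting in descending order ranks the images that best match the query.
scores = outputs.logits_per_image.squeeze(1)
for idx in torch.argsort(scores, descending=True).tolist():
    print(image_paths[idx], float(scores[idx]))
```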
