Posted 2023-04-05 20:30:36: RSS feed digest for 2023-04-05 20:00 (38 items)

Category / Site / Article title or trend word / Link URL / Frequent words, summary, or search volume / Date registered
IT ITmedia general article list [ITmedia News] "Kodomamo" app uses AI to detect sexually explicit selfies taken by children, and can flag photos taken with other camera apps https://www.itmedia.co.jp/news/articles/2304/05/news177.html itmedia 2023-04-05 19:34:00
IT ITmedia general article list [ITmedia News] Japanese edition of the "prompt engineering" textbook arrives, explaining how to use AI well, free of charge https://www.itmedia.co.jp/news/articles/2304/05/news175.html chatgpt 2023-04-05 19:32:00
TECH Techable beyerdynamic, whose headphones, microphones and earphones are also used by professionals, starts selling its latest series of models https://techable.jp/archives/201882 beyerdynamic 2023-04-05 10:00:13
python New posts tagged Python - Qiita [Convert to exe] How to run Python scripts without setting up a Python environment https://qiita.com/Soysoy11110000/items/bf4cec4f17b1df1ae8c2 pyinstaller 2023-04-05 19:23:41
python New posts tagged Python - Qiita Running Stable Diffusion on Docker https://qiita.com/T08Y02/items/12cfecd20b3ec442880a docker 2023-04-05 19:17:24
Ruby New posts tagged Ruby - Qiita Installing a newer Ruby on ConoHa WING with ruby-build https://qiita.com/active765/items/4a80c4852dbc5e334836 conohawing 2023-04-05 19:44:07
Linux New posts tagged Ubuntu - Qiita Apache2 on Docker https://qiita.com/NeK/items/51bbee008f11f0c62e0e apache 2023-04-05 19:34:24
Linux New posts tagged Ubuntu - Qiita Various preparations for using Ubuntu on WSL https://qiita.com/If_it_bleeds-we_can_kill_it/items/3ce21243f3caceeb4a81 macbook 2023-04-05 19:13:15
Docker New posts tagged docker - Qiita Running Stable Diffusion on Docker https://qiita.com/T08Y02/items/12cfecd20b3ec442880a docker 2023-04-05 19:17:24
Azure New posts tagged Azure - Qiita Migrating Azure Automation Run As accounts to a managed identity https://qiita.com/shingo_kawahara/items/f46b6a4d2122221c2417 azureautomation 2023-04-05 19:41:15
Overseas TECH DEV Community Web Scraping in Python: Avoid Detection Like a Ninja https://dev.to/zenrowshq/web-scraping-in-python-avoid-detection-like-a-ninja-2op1

Web Scraping in Python: Avoid Detection Like a Ninja. Scraping should be about extracting content from HTML. It sounds simple but has many obstacles. The first one is to obtain the said HTML. For that, we'll use Python to avoid detection. That might require bypassing anti-bot systems. Web scraping without getting blocked, using Python or any other tool, isn't a walk in the park. Websites tend to protect their data and access, and defensive systems could take many possible actions. Stay with us to learn how to mitigate their impact or directly bypass bot detection using Requests or Playwright. Note: when testing at scale, never use your home IP directly. A small mistake or slip and you will get banned.

Prerequisites. For the code to work, you'll need Python installed. Some systems have it pre-installed. After that, install all the necessary libraries: pip install requests playwright, then npx playwright install.

IP Rate Limit. The most basic security system is to ban or throttle requests from the same IP. A regular user wouldn't request a hundred pages in a few seconds, so that connection will be tagged as dangerous (import requests; response = requests.get(...); print(response.json()["origin"])). IP rate limits work similarly to API rate limits, but there is usually no public information about them, so we can't know for sure how many requests we can make safely. Our Internet Service Provider assigns us our IP, which we can't affect or mask. The solution is to change it. We can't modify a machine's IP, but we can use different machines. Datacenters might have different IPs, although that isn't a real solution. Proxies are: they take an incoming request and relay it to the final destination with no processing. That is enough to mask our IP and bypass the block, since the target website will see the proxy's IP.

Rotating Proxies. There are free proxies, even though we don't recommend them. They might work for testing but aren't reliable, and we only use some of them in the examples. Now we have a different IP, and our home connection is safe and sound. Good. But what if they block the proxy's IP? We're back to the initial position. We won't go into detail about free proxies: just use the next one on the list and change them frequently, since their lifespan is usually short. Paid proxy services, on the other hand, offer IP rotation. Our code would work the same, but the website would see a different IP; in some cases they rotate for every request or every few minutes. In any case, they're much harder to ban, and when it happens, we'll get a new IP after a short time (import requests; proxies = {"http": ...}; response = requests.get(..., proxies=proxies); print(response.json()["origin"])). We know about these, which means bot detection services also know about them. Some big companies will block traffic from known proxy IPs or datacenters. For those cases there is a higher proxy level: residential. Residential proxies are more expensive and sometimes bandwidth-limited, but they offer us IPs used by regular people. That implies that our mobile provider could assign us that IP tomorrow, or a friend had it yesterday. They're indistinguishable from actual final users. We can scrape whatever we want, right? The cheaper ones by default, the expensive ones when necessary? No, not there yet. We only passed the first hurdle, with some more to come. We must look like legitimate users to avoid being tagged as a bot or scraper.

User-Agent Header. The next step is to check our request headers. The best known one is User-Agent (UA for short), but there are many more. UA follows a format we'll see later, and many software tools, GoogleBot for example, have their own. Here is what the target website will receive if we directly use Python Requests or cURL: a python-requests User-Agent for the former (import requests; response = requests.get(...); print(response.json()["headers"]["User-Agent"])) and a curl one for the latter (curl http://httpbin.org/headers). Many sites won't check the UA, but this is a huge red flag for the ones that do, so we'll have to fake it. Luckily, most libraries allow custom headers. Following the example using Requests: headers = {"User-Agent": "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit (KHTML, like Gecko) Chrome Safari"}; response = requests.get(..., headers=headers). To get your current user agent, visit httpbin, just as the code snippet is doing, and copy it. Requesting all the URLs with the same UA might also trigger some alerts, making the solution a bit more complicated. Ideally, we would have all the current possible User-Agents and rotate them as we did with the IPs. Since that is nearly impossible, we can at least have a few. There are lists of User-Agents available to choose from: build an array of, say, five strings covering Windows and Linux Chrome, iPhone Safari and Android Chrome, pick one with random.choice(user_agents), and send it as the User-Agent header (a combined sketch of proxy plus header rotation appears after this entry). Keep in mind that browsers change versions quite often, and this list can be obsolete in a few months. If we're to use User-Agent rotation, a reliable source is essential; we can do it by hand or use a service provider. We are a step closer, but there is still one flaw in the headers: anti-bot systems also know this trick and check other headers along with the User-Agent.

Full Set of Headers. Each browser, or even each version, sends different headers. Check Chrome and Firefox in action: Chrome sends Accept, Accept-Encoding (gzip, deflate, br), Accept-Language, Host, the Sec-Ch-Ua client-hint headers, Sec-Ch-Ua-Mobile, Sec-Fetch-Dest, Sec-Fetch-Mode, Sec-Fetch-Site, Sec-Fetch-User, Upgrade-Insecure-Requests and its Chrome User-Agent; Firefox sends a similar set without the Sec-Ch-Ua client hints and with its own Firefox User-Agent. It means what you think it means: the previous array with five User-Agents is incomplete. We need an array with a complete set of headers per User-Agent. For brevity, we'll show a list with one item; it's already long enough. In this case, copying the result from httpbin isn't enough. The ideal would be to copy it directly from the source, and the easiest way to do that is from Firefox or Chrome DevTools (or the equivalent in your browser): go to the Network tab, visit the target website, right-click on the request and "Copy as cURL". Then convert the cURL syntax to Python and paste the headers into the list (headers = random.choice(headers_list); response = requests.get(..., headers=headers)). We could add a Referer header for extra security, such as Google or an internal page from the same website. It would mask the fact that we always request URLs directly without interaction. But be careful, since adding a referrer would change more headers. You don't want your Python Requests script blocked by mistakes like that.

Cookies. We ignored cookies above since they deserve a separate section. Cookies can help you bypass some anti-bots or get your requests blocked. They're a powerful tool that we need to understand correctly. For example, cookies can track a user session and remember that user after login. Websites assign each new user a cookie session; there are many ways to do it, but we'll try to simplify. The user's browser will then send that cookie in each request, tracking the user's activity. How is that a problem? We are using rotating proxies, so each request might have a different IP from different regions or countries. Anti-bots can see that pattern and block it, since it's not a natural way for users to browse. On the other hand, once you bypass the anti-bot solution, it'll send valuable cookies, and defensive systems won't check twice if the session looks legit. Check out how to bypass Cloudflare for more information. Will cookies help our Python Requests scripts avoid bot detection, or will they hurt us and get us blocked? The answer lies in our implementation. For simple cases, not sending cookies might work best: there is no need to maintain a session. For more advanced cases and anti-bot software, session cookies might be the only way to reach and scrape the final content, always taking into account that the session requests and the IP must match. The same happens if we want content generated in the browser after XHR calls: we'll need to use a headless browser. After the initial load, the JavaScript will try to get some content using an XHR call, and we can't do that call without cookies on a protected site. How will we use headless browsers, specifically Playwright, to avoid detection? Keep on reading.

Headless Browsers. Some anti-bot systems will only show the content after the browser solves a JavaScript challenge, and we can't use Python Requests to simulate browser behavior like that. We need a browser with JavaScript execution to run and pass the challenge. Selenium, Puppeteer and Playwright are the most used and best known libraries. Avoiding them for performance reasons would be preferable, as they'll make scraping slower, but sometimes there is no alternative. We'll see how to run Playwright. The snippet launches each browser type (chromium, firefox, webkit) with sync_playwright, opens a new page, visits a page that echoes the sent headers, and prints the User-Agent parsed from the response (json.loads(page.inner_text("pre"))); the output only shows the User-Agent, but since it's a real browser the headers will include the entire set: Accept, Accept-Encoding, etc. This approach comes with its own problem, though: take a look at the User-Agents. The Chromium one includes "HeadlessChrome", which will tell the target website that it's a headless browser, and they might act upon that. Back to the headers section: we can add custom headers that will overwrite the default ones. Replace the new_page line in the previous snippet with browser.new_page(extra_http_headers={"User-Agent": ...}) and paste a valid User-Agent. That is just entry level with headless browsers. Headless detection is a field in itself, and many people are working on it, some to detect it, some to avoid being blocked. As an example, you can visit Pixelscan with an actual browser and a headless one; to be deemed consistent, you'll need to work hard. When visiting Pixelscan with Playwright, the UA we fake is all right, but they can detect that we're lying by checking the navigator JavaScript API. We can pass user_agent, and Playwright will set the User-Agent in JavaScript and the header for us: page = browser.new_page(user_agent=...). Nice (see the Playwright sketch after this entry). You can easily add Playwright stealth to your scripts for more advanced cases and make detection harder; it handles inconsistencies between headers and browser JavaScript APIs, among other things. In summary, having full coverage is complex, but you won't need it most of the time. Sites can always do some more complex checks: WebGL, touch events or battery status. You won't need those extra features unless you are trying to scrape a website that requires bypassing an anti-bot solution like Akamai, and for those cases that extra effort will be mandatory. And demanding, to be honest.

Geographic Limits or Geo-Blocking. Have you ever tried to watch CNN from outside the US? That's called geo-blocking: only connections from inside the US can watch CNN live. To bypass that, we could use a Virtual Private Network (VPN). We can then browse as usual, but the website will see a local IP thanks to the VPN. The same can happen when scraping websites with geo-blocking. There is an equivalent for proxies: geolocated proxies. Some proxy providers allow us to choose from a list of countries. With that activated, we'll only get local IPs from, for example, the US.

Behavioral Patterns. Blocking IPs and User-Agents isn't enough these days: they become unmanageable and stale in hours, if not minutes. As long as we perform requests with clean IPs and real-world User-Agents, we are mostly safe. There are more factors involved, but most requests should be valid. However, most modern anti-bot software uses machine learning and behavioral patterns, not just static markers (IP, UA, geolocation). That means we would be detected if we always performed the same actions in the same order: go to the homepage, click on the "Shop" button, scroll down, go to the next page, and so on. After a few days, launching the same script could result in every request being blocked. Many people can perform those same actions, but bots have something that makes them obvious: speed. With software we would execute every step sequentially, while an actual user would take a second, then click, scroll down slowly using the mouse wheel, move the mouse to the link, and click. Maybe there is no need to fake all that, but be aware of the possible problems and know how to face them. We have to think about what we want. Maybe we don't need that first request, since we only require the second page. We could use that as an entry point, not the homepage, and save one request. That can scale to hundreds of URLs per domain: no need to visit every page in order, scroll down, click on the next page and start again. To scrape search results, once we recognize the URL pattern for pagination, we only need two data points: the number of items and the items per page. And most of the time that info is present on the first page or request (import requests; from bs4 import BeautifulSoup; soup = BeautifulSoup(response.content, "html.parser"); pages = soup.select(".woocommerce-pagination a.page-numbers:not(.next)"); print the first and last href). One request shows us how many pages there are, and we can now queue them. Mixing in the other techniques, we would scrape the content from this page and add the remaining ones. To scrape them while bypassing anti-bot systems, we could shuffle the page order to avoid pattern detection, use different IPs and User-Agents so each request looks like a new one, add delays between some of the calls, and use Google as a referrer randomly (a sketch of this appears after this entry). We could write some snippets mixing all these, but the best option in real life is to use a tool with it all, like Scrapy, pyspider, node-crawler (Node.js) or Colly (Go). The idea behind the snippets is to understand each problem on its own; for large-scale, real-life projects, handling everything on our own would be too complicated.

CAPTCHA. Even the best-prepared request can get caught and shown a CAPTCHA. Nowadays, solving CAPTCHAs is achievable with solutions like Anti Captcha and 2Captcha, but it's a waste of time and money. The best solution is to avoid them; the second best is to forget about that request and retry. The exception is obvious: sites that always show a CAPTCHA on the first visit. We have to solve it if there is no way to bypass it, and then use the session cookies to avoid being challenged again. It might sound counterintuitive, but waiting for a second and retrying the same request with a different IP and set of headers will be faster than solving a CAPTCHA. Try it yourself and tell us about the experience.

Be a Good Internet Citizen. We can use several websites for testing, but be careful when doing the same at scale. Try to be a good internet citizen and don't cause a DDoS. Limit your interactions per domain: Amazon can handle thousands of requests per second, but not all target sites will. We're always talking about "read-only" browsing mode: access a page and read its contents; never submit a form or perform active actions with malicious intent. If we take a more active approach, several other factors would matter: writing speed, mouse movement, navigation without clicking, browsing many pages simultaneously, etc. Bot prevention software is specifically aggressive with active actions, as it should be for security reasons. We won't discuss this part, but these actions will give them new reasons to block requests. Again, good citizens don't try massive logins; we're talking about scraping, not malicious activities. Sometimes websites make data collection harder, maybe not on purpose, but with modern frontend tools CSS classes could change daily, ruining thoroughly prepared scripts. For more details, read our previous entry on how to scrape data in Python.

Conclusion. We'd like you to remember the low-hanging fruit: IP-rotating proxies; residential proxies for challenging targets; a full set of headers, including User-Agent; bypassing bot detection with Playwright when a JavaScript challenge is required, maybe adding the stealth module; and avoiding patterns that might tag you as a bot. There are many more, and probably some we didn't cover, but with these techniques you should be able to crawl and scrape at scale. After all, web scraping without getting blocked with Python is possible if you know how. Contact us if you know more website scraping tricks or have doubts about applying them. Remember, we covered scraping and avoiding being blocked, but there is much more: crawling, converting and storing the content, scaling the infrastructure, and more. Stay tuned. Do not forget to take a look at the rest of the posts in this series: From Zero to Hero, Avoid Detection Like a Ninja, Crawling from Scratch, Scaling to Distributed Crawling. Did you find the content helpful? Spread the word and share it on Twitter, LinkedIn or Facebook. This article was originally published on ZenRows: Web Scraping in Python: Avoid Detection Like a Ninja. 2023-04-05 10:47:25
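The sections above on rotating proxies and full header sets describe a combined pattern whose code reaches this feed with its values stripped. Below is a minimal sketch of that pattern, not the article's original code: the proxy pool is intentionally left empty (free proxies are short-lived), and the single header set is an illustrative placeholder you would replace with headers copied from a real browser via DevTools.

```python
import random
import requests

# Hypothetical proxy pool; left empty here because free proxies are unreliable.
# Fill with entries like {"http": "http://<ip>:<port>", "https": "http://<ip>:<port>"}.
PROXIES: list[dict] = []

# One internally consistent header set per User-Agent (illustrative values only).
HEADERS_LIST = [
    {
        "User-Agent": (
            "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 "
            "(KHTML, like Gecko) Chrome/111.0.0.0 Safari/537.36"
        ),
        "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8",
        "Accept-Language": "en-US,en;q=0.9",
        "Upgrade-Insecure-Requests": "1",
    },
]


def fetch(url: str) -> requests.Response:
    """GET a URL with a randomly chosen proxy (if any) and header set."""
    proxy = random.choice(PROXIES) if PROXIES else None
    headers = random.choice(HEADERS_LIST)
    return requests.get(url, headers=headers, proxies=proxy, timeout=10)


if __name__ == "__main__":
    response = fetch("https://httpbin.org/headers")
    print(response.json()["headers"].get("User-Agent"))
```

Rotating the whole header set together, rather than swapping only the User-Agent, keeps the request internally consistent, which is the point the "Full Set of Headers" section makes.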
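The headless-browser section describes launching Playwright and passing user_agent so that both the HTTP header and navigator.userAgent are overridden, hiding the "HeadlessChrome" marker. A small sketch of that idea, assuming httpbin.org/headers as the echo endpoint and an illustrative Chrome User-Agent string:

```python
# Minimal Playwright sketch: headless Chromium with an overridden User-Agent.
# Requires: pip install playwright && playwright install chromium
# The UA string below is illustrative; copy a current one from a real browser.
import json

from playwright.sync_api import sync_playwright

FAKE_UA = (
    "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 "
    "(KHTML, like Gecko) Chrome/111.0.0.0 Safari/537.36"
)

with sync_playwright() as p:
    browser = p.chromium.launch(headless=True)
    # user_agent here updates both the request header and navigator.userAgent.
    page = browser.new_page(user_agent=FAKE_UA)
    page.goto("https://httpbin.org/headers")
    headers = json.loads(page.inner_text("pre"))["headers"]
    print(headers["User-Agent"])  # should no longer contain "HeadlessChrome"
    browser.close()
```

Passing user_agent to new_page() is what keeps the header and the JavaScript-visible value consistent; setting only extra_http_headers would leave navigator.userAgent exposing the headless default.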
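The behavioral-patterns section suggests reading the pagination from the first response, then requesting the remaining pages in shuffled order with delays between calls. A hedged sketch of that approach: the WooCommerce-style selector comes from the entry, while the base URL pattern and delay range are hypothetical placeholders.

```python
# Sketch: discover pagination from the first page, then fetch the rest in
# shuffled order with random delays. Selector follows the entry; the URL
# pattern is a hypothetical WooCommerce-style placeholder.
# Requires: pip install requests beautifulsoup4
import random
import time

import requests
from bs4 import BeautifulSoup

BASE_URL = "https://example.com/shop/page/{}/"  # hypothetical pagination pattern

first = requests.get(BASE_URL.format(1), timeout=10)
soup = BeautifulSoup(first.content, "html.parser")
pages = soup.select(".woocommerce-pagination a.page-numbers:not(.next)")
# Assumes the last numbered link is the final page; adjust for other themes.
last_page = int(pages[-1].get_text(strip=True)) if pages else 1

remaining = list(range(2, last_page + 1))
random.shuffle(remaining)  # avoid requesting pages strictly in order

for number in remaining:
    time.sleep(random.uniform(1, 5))  # small random delay between calls
    response = requests.get(BASE_URL.format(number), timeout=10)
    print(number, response.status_code)
```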
Overseas TECH DEV Community Javascript Object #4 https://dev.to/samr/javascript-object-4-26n7 Javascript Object. In the last chapter we saw a piece of closure called the Lexical Environment. Let's put that on hold for now and get back to our Object series in this blog. In the beginning chapter we learnt what an object is and how to create it, and in the past chapter we created an object using object literal syntax. Now we are going to create an object using the CONSTRUCTOR syntax. CONSTRUCTOR OBJECT. Before creating the object, let's look at the use cases of an object created with constructor syntax. let person = { firstName: "Rachael", lastName: "Ross" } is the basic method of creating an object using literal syntax. What if we want to create the same object multiple times but with different values? This is where the constructor syntax comes in. The basic syntax is: function Name(firstName, lastName) { this.firstName = firstName; this.lastName = lastName; }. These are just like normal functions, but the rules for calling a constructor function are: the name of a constructor function starts with a capital letter, like Person, Document, etc., and a constructor function should be called only with the new operator. Now let's create an object with our constructor function. In order to create a new object, we call our function with the new keyword to create a new instance: let friends = new Name("Chandler", "Monica"). That's it, quite easy, right? So what goes on behind the scenes while creating the new object? Create a new empty object and assign it to the this variable; assign the arguments "Chandler" and "Monica" to the firstName and lastName properties of the object; return the this value. So it is roughly equivalent to: function Name(firstName, lastName) { /* this = {}; */ this.firstName = firstName; this.lastName = lastName; /* return this; */ }. Pretty straightforward. We can also add methods to the constructor function. Let's create a method to get the full name of an object: function Name(firstName, lastName) { this.firstName = firstName; this.lastName = lastName; this.getFullName = function() { return this.firstName + " " + this.lastName; }; }. Now that we've added a method, let's call it: let friends = new Name("Ross", "Rachael"); console.log(friends.getFullName()); // Output: Ross Rachael. Simple, but not efficient: getFullName is duplicated for every new instance, which is not memory-efficient. To overcome this we can define the method on the object's prototype, so that every instance of that type can use it. What if we use return inside the constructor function? Returning from constructor functions: typically a constructor function implicitly returns this, set to the newly created object, but if it has a return statement the rules are: if return is called with an object, the constructor function returns that object instead of this; if return is called with a value other than an object, it is ignored. That's all about constructor functions; we'll see more about this in an upcoming post. Many thanks for your time. Sam. 2023-04-05 10:31:56
Overseas TECH DEV Community Adobe launches all-in-one Generative AI tool Firefly https://dev.to/amananandrai/adobe-launches-all-in-one-generative-ai-tool-firefly-58a3 Adobe, the tech giant in digital graphics and imaging that has launched tools like Photoshop, Illustrator and After Effects, has launched a new generative AI tool, Firefly. Firefly was launched in March. It is a browser-based generative AI tool for illustrators, concept artists and graphic designers, and an all-in-one tool with various features to help users: Text to Image, Inpainting, Text to Template, Text Effects, Recolor Vectors, Text to Vector, Upscaling and Image Extension, Personalized Results, 3D to Image, Text to Pattern, Text to Brush, and Sketch to Image. The tool is still in beta, and only two features, Text to Image and Text Effects, are available to users on a waitlist basis. You can watch the trailer of the tool and visit the site for further details. 2023-04-05 10:16:24
Overseas TECH DEV Community Mastering Django: Best Practices for Writing High-Quality Web Applications https://dev.to/pratyushcode/mastering-django-best-practices-for-writing-high-quality-web-applications-2mf8

Mastering Django: Best Practices for Writing High-Quality Web Applications. Django is a popular web framework for building high-quality web applications. It provides a robust set of tools and features that enable developers to build web applications quickly and efficiently. However, building a high-quality web application requires more than just knowing the basics of Django: to build a robust, scalable and maintainable application, developers need to follow best practices and use Django's features correctly. In this blog post we'll explore four essential practices that will help you keep your project organized, secure and production-ready. We'll cover how to create and update a .gitignore file, how to manage sensitive information using .env files, how to use a Django settings package to manage project settings, and finally we'll share some standard practices to keep your project production-ready. Let's get started.

NOTE: To highlight these design and configuration changes I have authored a repo, pratyzsh/django-best-practices. Please feel free to raise a pull request and support the project. The repository provides a collection of best practices for developing Django web applications, based on industry standards, that can help you write more robust, maintainable and scalable Django code. Its table of contents covers Getting Started, Project Structure, Environment Configuration, PostgreSQL Database Connection, Security, Testing, Deployment, Contributing and License. Getting started: first clone the repository from GitHub and switch to the new directory (git clone, then cd django-best-practices), activate the virtualenv for your project (source django-env/bin/activate), install project dependencies (pip install -r requirements.txt), then simply apply the migrations (python manage.py migrate). You can now run the development server (python manage.py runserver). A well-organized project structure is key to writing maintainable code, and managing environment variables is important for keeping your application secure and flexible.

Create and update the .gitignore file. We need .gitignore files to exclude certain files and directories from being tracked by Git. This helps to keep our repositories clean and avoids cluttering them with unnecessary files. Additionally, it improves repository performance and security by preventing sensitive or large files from being accidentally committed. Some common files you should always aim to exclude from commit history using .gitignore: the virtual environment folder, .env files, the static folder, and database files (in Django the default db.sqlite3 database needs to be excluded). For Django you can use a base template and extend it based on your project requirements.

Add .env files to manage sensitive information. .env files are used to securely store sensitive information such as passwords and API keys. They keep this information separate from source code and configuration files, and they allow for easy switching between different environments without modifying code. In the case of Django we have a bunch of sensitive information to deal with, like SECRET_KEY, DATABASE_NAME, DATABASE_PASS and so on. Here's how we can handle this in Django: create a .env file in the same directory as your settings.py file, looking something like DATABASE_NAME=postgres and DATABASE_USER=adam. After storing the secrets in that file, configure settings.py like this: import environ; env = environ.Env(); environ.Env.read_env() (reading the .env file); SECRET_KEY = env("SECRET_KEY"); DATABASES = {"default": {"ENGINE": "django.db.backends.postgresql_psycopg2", "NAME": env("DATABASE_NAME"), "USER": env("DATABASE_USER"), "PASSWORD": env("DATABASE_PASSWORD"), "HOST": env("DATABASE_HOST"), "PORT": env("DATABASE_PORT")}}. (A cleaned-up sketch of this appears after this entry.) Refer to the repository to understand the implementation details.

Create different settings.py files for different deployment environments. When you're developing a Django project you'll typically have different settings depending on the environment you're working in, such as development, testing, staging and production. A few key reasons: Security: production environments have different security requirements than development or testing environments, and using different settings files ensures that security measures like HTTPS and limited database permissions are properly implemented. Performance: production environments have different performance requirements, and separate settings files ensure that measures like caching and load balancing are properly implemented. Debugging: debugging in production is different from debugging in development or testing, and different logging or error-reporting tools might be needed, which can be specified in different settings files. Customisation: different environments might have different requirements or configurations that need to be customised, such as email or SMS providers, and separate settings files ensure these customisations are properly implemented for each environment. There are multiple approaches to managing this; I learnt a lot from another article, and the approach I borrowed is creating a separate settings package and changing the reference in the manage.py file. The final project structure looks like this: sampleproject with the original settings.py (the default settings file) replaced by a settings package with base.py as the new default settings file: settings/__init__.py, base.py, development.py, production.py. Once you have created the separate folder and added your settings files, change manage.py so that main() calls os.environ.setdefault("DJANGO_SETTINGS_MODULE", "settings.base"), pointing the default at settings/base.py. Once you have made these changes you can manage each environment independently through its own settings. To run the application with development.py in the example above, either change manage.py to suit your project or run this command to switch settings dynamically: python manage.py runserver --settings=settings.development. This is a fast way to toggle between environments before initiating the deployment process (a sketch of what the per-environment files might contain follows this entry). Refer to the repository to understand the implementation details. As a final note, one thing I have personally found very useful in ensuring these best practices are followed is proper naming, formatting and linting of my code. You can check out my other blog, "Streamlining Your Python Workflow with Black, Ruff, GitHub Actions and Pre-Commit Hooks", which is also used in the repository. Do check out the blog and repository, and like and share if you find this valuable. Implement these and experience a blazing-fast workflow. 2023-04-05 10:10:29
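The django-environ snippet in this entry arrives with its formatting stripped, so here is a cleaned-up sketch of the same pattern. Variable names follow the entry; the BASE_DIR handling and the DEBUG default are assumptions, and the database engine can equally be the modern "django.db.backends.postgresql".

```python
# settings/base.py (sketch): load secrets from a .env file with django-environ.
# Assumes `pip install django-environ` and a .env file next to this module.
from pathlib import Path

import environ

BASE_DIR = Path(__file__).resolve().parent

env = environ.Env(DEBUG=(bool, False))        # optional casting and default for DEBUG
environ.Env.read_env(str(BASE_DIR / ".env"))  # reading the .env file

SECRET_KEY = env("SECRET_KEY")
DEBUG = env("DEBUG")

DATABASES = {
    "default": {
        # The entry uses the psycopg2 alias; "django.db.backends.postgresql" also works.
        "ENGINE": "django.db.backends.postgresql_psycopg2",
        "NAME": env("DATABASE_NAME"),
        "USER": env("DATABASE_USER"),
        "PASSWORD": env("DATABASE_PASSWORD"),
        "HOST": env("DATABASE_HOST"),
        "PORT": env("DATABASE_PORT"),
    }
}
```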
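The entry lays out a settings package (base.py, development.py, production.py) but does not show what the per-environment modules contain. A minimal sketch under conventional assumptions; the hosts, email backend and security flags are illustrative, not taken from the article.

```python
# settings/development.py (sketch): start from the shared defaults in base.py
# and override only what differs locally. Values below are illustrative.
from .base import *  # noqa: F401,F403

DEBUG = True
ALLOWED_HOSTS = ["localhost", "127.0.0.1"]
EMAIL_BACKEND = "django.core.mail.backends.console.EmailBackend"  # print emails to console


# settings/production.py (sketch): tighten security and disable debug output.
# (Shown as comments so this file remains a single runnable module.)
# from .base import *  # noqa: F401,F403
#
# DEBUG = False
# ALLOWED_HOSTS = ["www.example.com"]   # hypothetical domain
# SECURE_SSL_REDIRECT = True
# SESSION_COOKIE_SECURE = True
# CSRF_COOKIE_SECURE = True
```

With this layout, the command from the entry, python manage.py runserver --settings=settings.development, selects the development module, while DJANGO_SETTINGS_MODULE defaults to settings.base in manage.py.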
Overseas TECH DEV Community ChatGPT Arrives to Azure OpenAI Service!🤖 https://dev.to/bytehide/chatgpt-arrives-to-azure-openai-service-eja

ChatGPT Arrives to Azure OpenAI Service. Are you ready for the next big thing in artificial intelligence? Today Microsoft announced the preview of ChatGPT on Azure OpenAI Service, making it easier than ever for developers to integrate AI-based experiences into their applications. Imagine enhancing chatbots to answer unexpected questions, speeding up customer support resolutions, or even automating claims processing. Sounds amazing, right?

A Game Changer for Businesses. Businesses of all sizes are discovering the real value of Azure OpenAI Service. Take The ODP Corporation, for example. Carl Brisco, their VP of products and technology, says: "We are delighted to leverage ChatGPT's effective artificial intelligence technology in Azure OpenAI Service... This technology will help The ODP Corporation drive continuous transformation in our business more effectively, explore new possibilities and design innovative solutions to deliver even more value to our customers and partners" (source: techcommunity.microsoft.com). The ODP Corporation is already using ChatGPT to create a chatbot for their HR department, and they've seen significant improvements in document review processes, job description generation and communication with associates. It's clear that ChatGPT's natural language processing and machine learning are revolutionizing the way businesses operate.

A Smart Nation Embraces AI. Singapore's Digital Government and Smart Nation Group is another great example. Feng-ji Sim, Deputy Secretary, shares: "ChatGPT and extended language models in general hold the promise of accelerating many types of knowledge work in the public sector... Azure OpenAI Service's enterprise controls have been instrumental in enabling the exploration of these technologies in policy, operations and communication-related use cases." Azure OpenAI Service empowers public servants with advanced AI tools, helping them deliver better services and ideas for Singapore.

Revolutionizing Contract Intelligence. Icertis, a company specializing in contract intelligence, is also harnessing ChatGPT's power. Monish Darda, their CTO, explains: "The availability of ChatGPT on Microsoft Azure OpenAI Service provides an effective tool to achieve these outcomes when leveraged with our data lake of billions of metadata and transactional elements... Generative artificial intelligence will help companies fully understand the intent of their commercial agreements, acting as an intelligent assistant that exposes and reveals information throughout the contract lifecycle." With AI backed by security and reliability, ChatGPT is set to create new opportunities for innovation in contract intelligence.

Microsoft's AI Transformation Journey. Microsoft itself is using Azure OpenAI Service to incorporate new experiences into its consumer and enterprise products. Some exciting examples include GitHub Copilot, an AI-driven programming tool to help developers accelerate code generation; Microsoft Teams Premium, with intelligent summaries and AI-generated chapters for better productivity; Microsoft Viva Sales, an AI-powered salesperson experience with suggested email content and data-driven insights; and Microsoft Bing, with an AI-powered chat option to revolutionize the consumer search experience.

A Responsible Approach to AI. Microsoft is committed to ensuring that AI systems are developed responsibly and used in ways that people can trust. By adopting a principles-based approach and learning from users' feedback, Microsoft aims to shape AI's future in a way that ultimately benefits humanity (source: techcommunity.microsoft.com).

What will happen in the future? As we have seen, ChatGPT on Azure OpenAI Service has the potential to transform the way businesses operate, making them more efficient and effective in their processes. It also empowers public servants with advanced AI tools, helping them deliver better services to citizens. Furthermore, companies specializing in contract intelligence can benefit greatly from ChatGPT's generative artificial intelligence capabilities. But it's not just businesses and governments that can benefit: Microsoft itself is leveraging this technology to create new experiences for its users, from an AI-driven programming tool to an AI-powered salesperson experience, while ensuring that these technologies are developed and used responsibly. At the heart of Microsoft's AI transformation journey is its commitment to responsible AI development; by adopting a principles-based approach and listening to user feedback, Microsoft aims to shape the future of AI in a way that ultimately benefits humanity. If you're interested in trying out ChatGPT on Azure OpenAI Service for yourself, now is the perfect time. With its natural language processing and machine learning capabilities, ChatGPT can enhance chatbots, speed up customer support resolutions and even automate claims processing. It's a game changer for businesses of all sizes, and it's all possible thanks to Azure OpenAI Service. 2023-04-05 10:07:24
Apple AppleInsider - Frontpage News Veci 2-in-1 MagSafe Wallet review: Best MagSafe Wallet that isn't Apple's https://appleinsider.com/articles/23/04/05/veci-2-in-1-magsafe-wallet-review-best-magsafe-wallet-that-isnt-apples?utm_medium=rss Veci 2-in-1 MagSafe Wallet review: Best MagSafe Wallet that isn't Apple's. If you're tired of having to choose between sleek MagSafe wallet minimalism and a bulky bifold that can carry everything, get one that can do both with Veci. Minimalist MagSafe wallet and bifold in one: it's easy to say we're minimalist-wallet kind of people; two or three card slots at most, plus MagSafe, and we're sold. Read more. 2023-04-05 10:52:33
Apple AppleInsider - Frontpage News Apple gives a first look at its new flagship store in Mumbai, India https://appleinsider.com/articles/23/04/05/apple-gives-a-first-look-at-its-new-flagship-store-in-mumbai-india?utm_medium=rss Apple gives a first look at its new flagship store in Mumbai, India. The striking artwork design for Apple BKC, the official name for Apple's first Mumbai store, has been revealed as the company works to expand its presence in India. Apple BKC in Mumbai: after years of rumors over where in India it would open a store, Apple is continuing to tease a forthcoming opening in Mumbai. While still not officially revealing a date beyond "coming soon", it has now released an image to local media. Read more. 2023-04-05 10:38:03
Apple AppleInsider - Frontpage News Apple Service ending for some older OS versions -- but it's not a big deal https://appleinsider.com/articles/23/04/05/apple-service-support-to-end-for-ios-11-era-software-in-may?utm_medium=rss Apple Services ending for some older OS versions, but it's not a big deal. A reliable source says internal documents reference the end of Apple Services support, except iCloud, for older versions of iOS, macOS High Sierra and others starting in May, but it isn't going to be a problem for most users. Here's why: iCloud will still work on iOS 11 and related releases. According to an accurate leaker known as Fudge (@StellaFudge on Twitter), operating systems from that era will not be able to access Apple services, with the exception of iCloud, starting in early May. Those still running these operating systems will likely be prompted to update. Read more. 2023-04-05 10:47:48
Overseas TECH Engadget 'Call of Duty' can detect and ban XIM-style cheat hardware https://www.engadget.com/call-of-duty-can-detect-and-ban-xim-style-cheat-hardware-100314416.html?src=rss 'Call of Duty' can detect and ban XIM-style cheat hardware. Activision's Call of Duty Ricochet anti-cheat team has introduced a number of new measures designed to reduce unfair play, including a replay investigation tool and detection of third-party XIM-type devices. Some cheaters will be permanently banned, but CoD has revealed that others will be subject to some new and rather hilarious in-game mitigations. To start with, it has deployed a system designed to detect third-party hardware cheat devices like XIM, Cronus Zen and ReaSnow. "These devices act as a passthrough for controllers on PC and console and, when used improperly or maliciously, can provide a player with the ability to gain an unfair gameplay advantage, such as reducing or eliminating recoil," the team noted in a blog post. At first, Ricochet will give players an unsupported-device warning, but continued use could result in measures ranging from mitigations up to permabans across all Call of Duty titles. It's also using a new replay investigation tool: "Using captured and stored match gameplay data, our teams can load up and watch any completed match as part of our investigation process," the team wrote. It'll focus on ranked play in both Modern Warfare II and Warzone, capturing and storing all match data for signs of suspicious activity; the system has already aided in investigations that resulted in permanent bans. Activision revealed more about mitigations as well. It has already talked about Damage Shield, which allows innocent players to take fire without being injured, and has now detailed the Disarm and Cloak measures. In the Disarm demo, a player who tries to switch weapons ends up facing their opponent with no weapon at all; Cloak, as you'd imagine, turns enemies invisible. Ubisoft recently launched its own crackdown that allows players to continue, albeit with significant handicaps, until they unplug cheat devices, and Epic Games also recently pulled out the perma-banhammer for cheaters. Last year Activision said its anti-cheat measures had led to a significant drop in cheaters, although it added that it expects players to create new ways to get around existing measures: "We know tomorrow will continue to deliver new and evolving threats," team Ricochet wrote. This article originally appeared on Engadget. 2023-04-05 10:03:14
Overseas TECH Engadget Best Buy's new recycling program will let you mail in your old electronics https://www.engadget.com/best-buys-new-recycling-program-will-let-you-mail-in-your-old-electronics-100030769.html?src=rss Best Buy's new recycling program will let you mail in your old electronics. Best Buy announced today that it's extending its gadget recycling program to include a new mail-in option: the retailer will now sell you a box for your used electronics that you can ship back for recycling, saving a trip to the store. Best Buy says it's recycled over a billion pounds of electronics and appliances through its existing programs, describing itself as the US's largest retail collector of e-waste. The program lets you order a box in one of two sizes, a small one for e-waste weighing up to six lbs and a medium one with a higher weight limit. After receiving it, you can pack in as many approved devices as you can fit, as long as they stay under the weight limits, then either take the box to a UPS drop-off point or schedule a UPS pickup. The program is an extension of Best Buy's existing free in-store recycling program. The retailer also provides a home pickup option, but it costs extra and is aimed at unusually cumbersome items like home theater gear and heavy appliances. All of its recycling initiatives accept computers, tablets, TVs, smartphones, radios, appliances, cameras and other common gadgets; the complete list and exclusions are published by Best Buy. The free in-store program will be more practical unless you live far from a Best Buy location, but some people will be willing to pay to avoid making the trip, especially during the holiday shopping season or if they have disposable income and live in a congested area. If nothing else, Earth Day (April 22nd) is an appropriate time to raise awareness of e-waste recycling and nudge people away from throwing these items in the trash, where they'd get hauled off to landfills. This article originally appeared on Engadget. 2023-04-05 10:00:30
Raspberry Pi Raspberry Pi Test our new Code Editor for young people https://www.raspberrypi.org/blog/code-editor-beta-testing/ Test our new Code Editor for young people. We are building a new online text-based Code Editor to help young people learn to write code. It's free and designed for young people who attend Code Clubs and CoderDojos, students in schools, and learners at home. At this stage of development, the Code Editor enables learners to... We've chosen Python... The post "Test our new Code Editor for young people" appeared first on Raspberry Pi Foundation. 2023-04-05 10:55:30
Overseas News Japan Times latest articles Everyone wants to make Ryan Reynolds money https://www.japantimes.co.jp/opinion/2023/04/05/commentary/world-commentary/big-celebrity-paydays/ everyone 2023-04-05 19:12:00
Overseas News Japan Times latest articles The China-U.S. relationship must be fixed https://www.japantimes.co.jp/opinion/2023/04/05/commentary/world-commentary/china-us-relations-2/ bilateral 2023-04-05 19:10:53
Overseas News Japan Times latest articles Just how dangerous are India's generic drugs? Very. https://www.japantimes.co.jp/opinion/2023/04/05/commentary/world-commentary/indian-drug-safety/ Just how dangerous are India's generic drugs? Very. The red flags for Indian drugs have been there for years; the world deserves much better than contaminated medicine and children poisoned by cough syrup. 2023-04-05 19:10:00
Overseas News Japan Times latest articles A world with Putin and Trump in the dock https://www.japantimes.co.jp/opinion/2023/04/05/commentary/world-commentary/putin-trump-trials/ A world with Putin and Trump in the dock: the recent indictments of Russian leader Vladimir Putin and former U.S. President Donald Trump highlight the law's growing and potentially dangerous dominion in politics. 2023-04-05 19:08:38
Overseas News Japan Times latest articles Is 'Americanness' in peril? https://www.japantimes.co.jp/opinion/2023/04/05/commentary/japan-commentary/u-s-society-being-tested/ mediocre 2023-04-05 19:06:54
News BBC News - Home Nicola Sturgeon's husband Peter Murrell arrested in SNP finance probe https://www.bbc.co.uk/news/uk-scotland-65187823?at_medium=RSS&at_campaign=KARANGA headquarters 2023-04-05 10:51:49
News BBC News - Home Judge says parents and children should receive infected blood payments https://www.bbc.co.uk/news/health-65179522?at_medium=RSS&at_campaign=KARANGA compensation 2023-04-05 10:43:52
News BBC News - Home Ramadan and Passover raise tensions at Jerusalem holy site https://www.bbc.co.uk/news/world-middle-east-65187651?at_medium=RSS&at_campaign=KARANGA compound 2023-04-05 10:46:15
News BBC News - Home President Biden confirms visit to Northern Ireland https://www.bbc.co.uk/news/uk-northern-ireland-65170050?at_medium=RSS&at_campaign=KARANGA dublin 2023-04-05 10:38:04
News BBC News - Home European Super League 'like wolf in Little Red Riding Hood' https://www.bbc.co.uk/sport/football/65178168?at_medium=RSS&at_campaign=KARANGA European Super League 'like wolf in Little Red Riding Hood': Uefa president Aleksander Ceferin criticises supporters of the European Super League, comparing them to the wolf in Little Red Riding Hood. 2023-04-05 10:34:14
GCP Google Cloud Platform Japan official blog Introducing Looker Modeler: a single source of truth for BI metrics https://cloud.google.com/blog/ja/products/data-analytics/introducing-looker-modeler/ This new capability is the next step toward realizing Google's vision of making Looker a more open BI platform, delivering trusted data to every user through the tools they already know. 2023-04-05 11:50:00
News Newsweek Is Putin mentally ill rather than physically ill? A former bodyguard reveals a string of bizarre behaviors https://www.newsweekjapan.jp/stories/world/2023/04/post-101307.php 2023-04-05 19:39:42
News Newsweek Buying "NFT art" to contribute to carbon offsets? A US startup rolls out the scheme https://www.newsweekjapan.jp/stories/sdgs/2023/04/nft-5.php However, recent surveys suggest that while many people say they are "interested" or "somewhat interested" in eco and environmental issues, a large share take no concrete action, so it is hard to say that eco-friendly behavior is part of most people's daily lives. 2023-04-05 19:05:38
IT Weekly ASCII Up to 80% off 106 titles! The second "THQ Nordic Spring Sale" is now on at Steam https://weekly.ascii.jp/elem/000/004/131/4131678/ steam 2023-04-05 19:35:00
IT Weekly ASCII Asteria launches "TASUKE for Bugyo Cloud," a subscription-based data integration service https://weekly.ascii.jp/elem/000/004/131/4131677/ tasukefor 2023-04-05 19:30:00
IT Weekly ASCII "Lolipop! Rental Server by GMO Pepabo" and "GMO Renshu" launch new features powered by ChatGPT https://weekly.ascii.jp/elem/000/004/131/4131664/ chatgpt 2023-04-05 19:10:00
GCP Cloud Blog JA Introducing Looker Modeler: a single source of truth for BI metrics https://cloud.google.com/blog/ja/products/data-analytics/introducing-looker-modeler/ This new capability is the next step toward realizing Google's vision of making Looker a more open BI platform, delivering trusted data to every user through the tools they already know. 2023-04-05 11:50:00
