Posted: 2023-02-08 00:26:27 RSS Feed 2023-02-08 00:00 Roundup (30 items)

Category | Site | Article title / trend words | Link URL | Frequent words / summary / search volume | Date registered
IT 気になる、記になる… Teardown video of the "HomePod (2nd generation)": designed so parts are easier to replace than in the first generation https://taisy0.com/2023/02/07/168208.html apple 2023-02-07 14:18:58
AWS AWS Government, Education, and Nonprofits Blog How Gaggle’s ReachOut program uses AWS to ease the K12 mental health crisis https://aws.amazon.com/blogs/publicsector/gaggles-reachout-program-uses-aws-ease-k12-mental-health-crisis/ US schools are doing their best to support students, but with staffing shortages and additional challenges exacerbated by the widespread disruption in teaching and learning, the education community needs solutions to help with recovery. Students also need help available to them when they need it, and that may not fall within the hours of the school day. AWS Partner Gaggle is an education technology (EdTech) company that delivers solutions focused on helping to keep students safe. Gaggle developed ReachOut, a mental health hotline built on AWS that connects K12 students to trained Gaggle support counselors anywhere, anytime. 2023-02-07 14:40:14
AWS AWS AWS Supports You | Diving Deep into AWS Systems Manager https://www.youtube.com/watch?v=xHNLNTa2xGU We would love to hear your feedback about our show; please take our survey. "AWS Supports You: Diving Deep into AWS Systems Manager" gives viewers on our twitch.tv/aws channel an overview of how to manage AWS instances at scale using AWS Systems Manager Session Manager and Patch Manager, including a live demo. This episode originally aired in February. Chapters: Introduction; AWS Systems Manager; Customer Testimonials; AWS Systems Manager Session Manager; AWS Systems Manager Patch Manager; Demo; Conclusions; Helpful Links. Ask our experts about AWS Systems Manager, Session Manager, and Patch Manager; the video also includes QR codes linking to our online workshop resources. Subscribe: More AWS videos; More AWS events videos. ABOUT AWS: Amazon Web Services (AWS) is the world's most comprehensive and broadly adopted cloud platform, offering over 200 fully featured services from data centers globally. Millions of customers, including the fastest-growing startups, largest enterprises, and leading government agencies, are using AWS to lower costs, become more agile, and innovate faster. #AWS #AmazonWebServices #CloudComputing 2023-02-07 14:40:30
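The episode's workflow (running commands and patch scans across a fleet without SSH) can be scripted against the same services. Below is a minimal boto3 sketch, not taken from the episode: the region and instance ID are placeholders, and it assumes the target instance is already registered with the SSM agent.

```python
# Minimal sketch: trigger a Patch Manager scan via Systems Manager
# Run Command, then poll for the invocation result.
# Assumptions: boto3 credentials are configured; the instance ID and
# region below are placeholders for your own values.
import time

import boto3

ssm = boto3.client("ssm", region_name="us-east-1")

# AWS-RunPatchBaseline is the managed document Patch Manager uses;
# Operation=Scan reports missing patches without installing anything.
resp = ssm.send_command(
    InstanceIds=["i-0123456789abcdef0"],
    DocumentName="AWS-RunPatchBaseline",
    Parameters={"Operation": ["Scan"]},
)
command_id = resp["Command"]["CommandId"]

time.sleep(5)  # give the agent a moment before polling
result = ssm.get_command_invocation(
    CommandId=command_id, InstanceId="i-0123456789abcdef0"
)
print(result["Status"])  # e.g. InProgress, Success, Failed
```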
js New posts tagged JavaScript - Qiita Reimplementing the "spawn particles along the mouse cursor trail" effect seen in a CSS-Tricks article with p5.js (partially simplified) https://qiita.com/youtoy/items/190e6049172a22d9e592 csstricks 2023-02-07 23:32:09
AWS New posts tagged AWS - Qiita Summary of the three major cloud vendors' certification exams https://qiita.com/oko1977/items/98f65f504a5ef28f972d self-study 2023-02-07 23:09:53
GCP New posts tagged gcp - Qiita Summary of the three major cloud vendors' certification exams https://qiita.com/oko1977/items/98f65f504a5ef28f972d self-study 2023-02-07 23:09:53
Azure New posts tagged Azure - Qiita Summary of the three major cloud vendors' certification exams https://qiita.com/oko1977/items/98f65f504a5ef28f972d self-study 2023-02-07 23:09:53
Git New posts tagged Git - Qiita [Git] Adding and removing submodules https://qiita.com/P-man_Brown/items/7a5da8aa5a2d8ae72ab8 git submodule 2023-02-07 23:45:11
Tech Blog Developers.IO Tried the Matterport SDK for Embeds package tutorial https://dev.classmethod.jp/articles/the-matterport-sdk-for-embeds-package-tutorial/ delivery 2023-02-07 14:54:53
Overseas TECH MakeUseOf EarFun Air Pro 3 Review: Good Sound and ANC for Under $100 https://www.makeuseof.com/earfun-air-pro-3-review/ bucks 2023-02-07 14:05:15
Overseas TECH DEV Community How To Generate Test Data for Your Database Project With Python https://dev.to/mattdark/how-to-generate-test-data-for-your-database-project-with-python-3nkf If you need test data for the database of your project, you can get a dataset from Kaggle or use a data generator. In the first case, if you need to process the data before inserting it into the database, you can use Pandas, a widely used Python library for data analysis; it supports different formats, including CSV and JSON, and also provides a method for inserting data into a SQL database. If you choose a data generator instead, you can find one for MySQL in one of the repositories on our Percona Lab GitHub account. Using other database technologies? You can follow the guides I already published, where I explain how to create your own data generator for MySQL; it could work for PostgreSQL and MongoDB too. If you create your own data generator, this is the process to follow: generate fake data using Faker; store the generated data in a Pandas DataFrame; establish a connection to your database; insert the content of the DataFrame into the database.
Dependencies: make sure all the dependencies are installed before creating the Python script. Create a requirements.txt with pandas, tqdm, and faker, or, if you're using Anaconda, an environment.yml with the same dependencies (the Python version can be changed). Depending on the database technology, add the corresponding driver package: PyMySQL for MySQL, psycopg for PostgreSQL, or pymongo for MongoDB. Then run pip install -r requirements.txt, or conda env create -f environment.yml when using Anaconda.
Database: create a database named company for MySQL (log in with mysql -u root -p, replacing root with your username and localhost with your server's address if needed) or for PostgreSQL (sudo su postgres, psql, then create database company;). The MongoDB database does not need to be created in advance.
Creating a Pandas DataFrame: the script implements multiprocessing to optimize execution time. In Python, single-CPU use is caused by the global interpreter lock, which allows only one thread to carry the Python interpreter at any given time; with multiprocessing, the workload is divided across every CPU core available. Create a modules directory containing a module named dataframe.py, which will be imported later into the main script and is where the data-generating method is defined. It imports cpu_count from multiprocessing, pandas, tqdm (to show a progress bar during DataFrame creation), and Faker (the generator from the faker library). A faker generator is created and initialized by calling Faker(), and the number of available cores is assigned with num_cores = cpu_count() - 1 (all the cores minus one, to avoid freezing the computer). The create_dataframe function then builds an empty DataFrame and, inside a tqdm-wrapped loop of x iterations, adds a row of data per iteration through the DataFrame's .loc attribute (which provides access to a group of rows and columns by label), filling the columns first_name, last_name, job, company, address, city, country, and email with Faker values; all eight columns end up with non-null object dtype. Note: the number of generated records can be adapted to your project by modifying the x variable.
Connection to the database (MySQL and PostgreSQL): before inserting the generated data, establish a connection using SQLAlchemy, the Python SQL toolkit and Object Relational Mapper that gives application developers the full power and flexibility of SQL. Import create_engine (for connecting to the database) and sessionmaker (for creating a session bound to the engine object), then call engine = create_engine("mysql+pymysql://user:password@localhost/company") and Session = sessionmaker(bind=engine); don't forget to replace user, password, and localhost with your authentication details. Save this code in the modules directory as base.py. For PostgreSQL, replace the mysql+pymysql engine URL with postgresql+psycopg.
Database schema definition: for MySQL and PostgreSQL the schema could be defined through the Schema Definition Language provided by SQLAlchemy, but as we're only creating one table and importing the DataFrame by calling Pandas' to_sql method, this is not necessary. Instead, define a schema dict mapping each column (first_name, last_name, job, company, address, city, country, email) to sqlalchemy.types.String, save it in the modules directory as schema.py, and pass the schema variable as a parameter to to_sql.
MongoDB: establish the connection with the PyMongo library instead. Import MongoClient, build a URI of the form mongodb://user:password@localhost:port/, and call client = MongoClient(uri); replace user, password, localhost, and port with your authentication details, and save this code in the modules directory as base.py.
Generating your data (MySQL and PostgreSQL): with all the required modules ready, create the sql.py script. Import Pool and cpu_count from multiprocessing (the Pool class allows you to create and manage process pools in Python) and pandas, then import create_dataframe from modules.dataframe, schema from modules.schema, and Session and engine from modules.base. Inside the if __name__ == "__main__": block, create the multiprocessing pool configured to use all available CPU cores minus one; each core calls create_dataframe and builds its own DataFrame, and once every call has finished, all the DataFrames are concatenated into a single one with pd.concat(pool.map(create_dataframe, range(num_cores))). Finally, insert the DataFrame into the database with data.to_sql(name="employees", con=engine, if_exists="append", index=False, dtype=schema); all the data is stored in a table named employees. Because the table is created without a primary key, execute ALTER TABLE employees ADD id INT NOT NULL AUTO_INCREMENT PRIMARY KEY FIRST via engine.connect() for MySQL, or ALTER TABLE employees ADD COLUMN id SERIAL PRIMARY KEY for PostgreSQL.
MongoDB: create the mongodb.py script the same way, importing client from modules.base instead. After building the concatenated DataFrame, convert it with data.to_dict("records"), select the database and collection with db = client.company and collection = db.employees, and insert everything with collection.insert_many(data_dict); all the data is stored in a collection named employees.
Running the script: run python sql.py or python mongodb.py to populate the table; execution time depends on the CPU cores available on your machine.
Query your data: once the script finishes, check the data in the database. For MySQL and PostgreSQL, connect to the company database (use company in MySQL, \c company in PostgreSQL) and run select count(*) from employees; the count() function returns the number of records in the employees table. For MongoDB, use company, then db.employees.count() returns the record count and db.employees.find().pretty() displays the records.
The code shown in this blog post can be found on my GitHub account in the data-generator repository. This post was brought to you by Percona. Want to try Percona Monitoring and Management? Check our demo. 2023-02-07 14:16:36
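A condensed, self-contained version of the article's pipeline is sketched below. It follows the same Faker, DataFrame, to_sql flow with a Pool of workers, but as an assumption for portability it writes to a local SQLite file instead of the article's MySQL connection string, and it builds each chunk as a list of dicts rather than row-by-row .loc assignment.

```python
# Sketch of the pipeline: Faker -> pandas DataFrame -> to_sql, parallelized
# with a multiprocessing Pool (all cores minus one, as in the post).
# Assumption: SQLite is used so this runs anywhere; swap the engine URL
# for mysql+pymysql://user:password@localhost/company to match the post.
from multiprocessing import Pool, cpu_count

import pandas as pd
from faker import Faker
from sqlalchemy import create_engine

ROWS_PER_WORKER = 1_000  # adapt to your project, like x in the post


def create_dataframe(_: int) -> pd.DataFrame:
    fake = Faker()  # one generator per worker process
    rows = [
        {
            "first_name": fake.first_name(),
            "last_name": fake.last_name(),
            "job": fake.job(),
            "company": fake.company(),
            "address": fake.address(),
            "city": fake.city(),
            "country": fake.country(),
            "email": fake.email(),
        }
        for _ in range(ROWS_PER_WORKER)
    ]
    return pd.DataFrame(rows)


if __name__ == "__main__":
    num_cores = max(cpu_count() - 1, 1)  # leave one core free
    with Pool(num_cores) as pool:
        data = pd.concat(pool.map(create_dataframe, range(num_cores)))
    engine = create_engine("sqlite:///company.db")
    data.to_sql(name="employees", con=engine, if_exists="append", index=False)
    print(f"inserted {len(data)} rows")
```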
Overseas TECH DEV Community GraphQL Union Types at Woovi https://dev.to/woovi/graphql-union-types-at-woovi-2ana GraphQL is a popular query language for APIs that provides a flexible and efficient way to retrieve data from a server. One of the key features of GraphQL is the ability to define union types, which allow you to return multiple object types from a single field. In this post we take a closer look at union types in GraphQL and how to implement them in JavaScript using the GraphQL.js library, covering the basics of union types, including their definition and usage, and exploring real-world scenarios where they can be applied to simplify and improve the performance of your API. Whether you are a seasoned GraphQL developer or just starting out, this post provides valuable insight into the power and versatility of union types.
Union types: union types are very similar to interfaces, but they don't get to specify any common fields between the types (see the official doc to understand more about the concept).
Union types in GraphQL, code first: for the following example, keep this scenario in mind: we have an array field of payment methods; each payment method can be CREDIT_CARD or DEBIT_CARD; when saving the payment method, each one has its own fields with one in common, method, which carries the payment method type (method: 'CREDIT_CARD').
Union type sub-types: start by declaring a type for each payment method, e.g. export const PaymentMethodCreditCardType = new GraphQLObjectType({ name: 'PaymentMethodCreditCard', fields: () => ({ method: { type: new GraphQLNonNull(GraphQLString), resolve: (paymentMethodCreditCard) => paymentMethodCreditCard.method } }) }), and the equivalent PaymentMethodDebitCardType with name 'PaymentMethodDebitCard'.
Union type, the type itself: then declare the union type that consumes both types above: export const PaymentMethodUnionType = new GraphQLUnionType({ name: 'PaymentMethodUnion', description: 'Represents a generic payment method type (credit card, debit card)', types: () => [PaymentMethodCreditCardType, PaymentMethodDebitCardType], resolveType: (data: Record<string, unknown>) => { if (data.method === 'CREDIT_CARD') return PaymentMethodCreditCardType.name; if (data.method === 'DEBIT_CARD') return PaymentMethodDebitCardType.name; return null; } }). Note that resolveType receives the value passed from the parent; since the field is an array, it receives the items of the array, which are mapped by the method field mentioned above and pointed to the right type.
Union type, exposing it: expose paymentMethods as an array of payment methods consuming the union type: paymentMethods: { type: new GraphQLList(PaymentMethodUnionType), resolve: (payment) => payment.paymentMethods }.
Union type, consuming it: on the frontend we just ask GraphQL for it, for example in a mutation that sets the selected payment methods and returns the field filled: mutation PaymentMethodAddMutation($input: PaymentMethodAddInput!) { PaymentMethodAdd(input: $input) { error payment { id paymentMethods { ... on PaymentMethodCreditCard { method } ... on PaymentMethodDebitCard { method } } } } }. Now your payment methods can have fields specific to each one.
If you want to work in a startup in its early stages, this is your chance: apply today. Woovi is a startup that enables shoppers to pay as they please; to make this possible, Woovi provides instant payment solutions for merchants to accept orders. If you want to work with us, we are hiring. Photo by Martin Olsen on Unsplash. 2023-02-07 14:14:56
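The post's examples use GraphQL.js; the same union wiring maps almost one-to-one onto graphql-core, its Python reference port. A minimal sketch follows, with made-up in-memory payment data standing in for Woovi's real resolvers.

```python
# Union-type sketch with graphql-core (the Python port of GraphQL.js).
# Type and field names mirror the post; the payment data is made up.
from graphql import (
    GraphQLField, GraphQLList, GraphQLNonNull, GraphQLObjectType,
    GraphQLSchema, GraphQLString, GraphQLUnionType, graphql_sync,
)


def make_method_type(name: str) -> GraphQLObjectType:
    # Each payment method type exposes the shared non-null "method" field.
    return GraphQLObjectType(name, {
        "method": GraphQLField(
            GraphQLNonNull(GraphQLString),
            resolve=lambda pm, info: pm["method"],
        ),
    })


CreditCardType = make_method_type("PaymentMethodCreditCard")
DebitCardType = make_method_type("PaymentMethodDebitCard")

# resolve_type receives each array item and routes it by the shared
# "method" discriminator, exactly like resolveType in the post.
PaymentMethodUnion = GraphQLUnionType(
    "PaymentMethodUnion",
    [CreditCardType, DebitCardType],
    resolve_type=lambda pm, info, _t: (
        "PaymentMethodCreditCard" if pm["method"] == "CREDIT_CARD"
        else "PaymentMethodDebitCard" if pm["method"] == "DEBIT_CARD"
        else None
    ),
)

QueryType = GraphQLObjectType("Query", {
    "paymentMethods": GraphQLField(
        GraphQLList(PaymentMethodUnion),
        resolve=lambda root, info: root["paymentMethods"],
    ),
})

schema = GraphQLSchema(query=QueryType)
result = graphql_sync(
    schema,
    """{ paymentMethods {
           ... on PaymentMethodCreditCard { method }
           ... on PaymentMethodDebitCard { method }
    } }""",
    root_value={"paymentMethods": [
        {"method": "CREDIT_CARD"}, {"method": "DEBIT_CARD"},
    ]},
)
print(result.data)  # {'paymentMethods': [{'method': 'CREDIT_CARD'}, ...]}
```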
Overseas TECH DEV Community Powering AI Capabilities with Apache APISIX and OpenAI API https://dev.to/apisix/powering-ai-capabilities-with-apache-apisix-and-openai-api-3912 Artificial intelligence (AI) has revolutionized the way we interact with technology and has become an integral part of modern applications. The OpenAI API provides developers with powerful AI capabilities, allowing them to build advanced AI applications with ease. However, as the usage of AI grows, so does the need for scalable, performant, and secure API integrations. This is where Apache APISIX comes in: a high-performance, open-source API gateway that provides advanced features for managing and scaling API integrations. This post explores the benefits of integrating Apache APISIX with the OpenAI API, from proxy caching to security features; whether you are an AI developer or a DevOps professional, it is a complete guide to creating a powerful and cost-effective AI integration.
Learning objectives: what the OpenAI API and Apache APISIX are; the benefits of using Apache APISIX with the OpenAI API; several Apache APISIX plugin use cases that enhance the OpenAI API; how to create a new Route in APISIX for the OpenAI API; how to add the OpenAI API endpoint as an Upstream for the route; how to configure authentication, rate limiting, and caching for the route as needed; and how to test the route to make sure requests are being forwarded correctly to the OpenAI API.
What's the OpenAI API? OpenAI is a cutting-edge platform for creating and deploying advanced artificial intelligence models for tasks such as natural language processing, image recognition, and sentiment analysis. A key benefit is that it provides an API developers can use to access these models and incorporate them into their applications. The OpenAI API is a cloud-based platform providing access to OpenAI's AI models, including ChatGPT, which is particularly well suited to use cases that require natural language processing and text generation, such as generating chatbot responses, text or code completion suggestions, or answering questions in a conversational interface.
What's Apache APISIX? Apache APISIX is an open-source, cloud-native API traffic management solution offering API gateway features to create RESTful APIs that are scalable, secure, and highly available. Using an API gateway with the OpenAI API lets you easily create and deploy scalable, secure, high-performance APIs that access the OpenAI models, incorporating the power of OpenAI into your applications while providing a great experience for your users.
Benefits of using Apache APISIX with the OpenAI API: scalability (manage and scale the OpenAI API to handle increased traffic and usage demands); performance (improve OpenAI API request performance by caching responses and reducing latency); security (encryption and authentication features make it easy to secure access); flexibility (customize and configure the integration as needed); and monitoring and analytics (track and optimize the performance of your integration).
Apache APISIX plugins to enhance the OpenAI API: rate limiting, to limit the number of API requests and prevent overuse; authentication, to secure access through authentication and authorization mechanisms; traffic control, to keep the performance and stability of the OpenAI API consistent; observability, to monitor and log API requests and responses; caching, to cache API responses, reducing both the number of requests and the cost of using the OpenAI API; and transformation, to modify requests and responses, transforming data from one format to another (such as JSON to XML).
Demo, managing OpenAI APIs with Apache APISIX: with enough theory in mind, the practical session uses Apache APISIX as a simple API gateway in front of the OpenAI API Completion endpoint to build a product-description generator. A typical request to the gateway posts a JSON body with a text-davinci model, a prompt such as "Write a brief product description for Apple Pro", a temperature, and max_tokens, and receives back a text_completion object whose first choice contains the generated description, along with token-usage counts. Prerequisites: familiarity with fundamental OpenAI API completion concepts; an OpenAI API Key, created from the API key management page after logging into the OpenAI website; Docker installed on your machine to run APISIX; and basic knowledge of APISIX core concepts such as Route, Upstream, and Plugin.
Set up the project: clone the apisix-docker project repo from GitHub (git clone) and open the project folder in your favorite code editor (the tutorial uses VS Code). Install and run Apache APISIX: from the root folder of the project run docker compose up -d, which runs Apache APISIX and etcd together with Docker; other installation options are covered in the installation guide.
Create an Upstream for the OpenAI API: an Upstream in APISIX refers to the backend servers responsible for serving the actual request data. Using the Admin API (a PUT to /apisix/admin/upstreams with the X-API-KEY admin header), define the upstream at api.openai.com with a single node, roundrobin type, and the https scheme for secure communication.
Create a new plugin config: create a plugin config with the proxy-rewrite plugin enabled, which redefines requests to point at the OpenAI API completion endpoint. Its configuration sets the URI to the completions endpoint, the host to api.openai.com, and headers that pass the OpenAI API key in Authorization and set Content-Type to application/json.
Set up a Route for the OpenAI completion endpoint: create a Route handling POST requests on a custom gateway path (openai product desc), referencing the upstream and plugin config created in the previous steps by their unique IDs. Additionally, the route is set up with retries, a timeout, and a keepalive timeout to ensure robust and resilient communication with the OpenAI API.
Test with a cURL request: POST to the gateway path with a tool like cURL or Postman; the gateway forwards the request to the OpenAI completion endpoint and returns HTTP 200 OK with the completion JSON.
Create a new consumer and add authentication: up to now the gateway endpoint is public and accessible by unauthorized users, even though communication between APISIX and the OpenAI API is secured by the API key header. Create a new consumer with basic-auth credentials (username and password), then update the existing plugin config to append the basic-auth plugin so the route checks the request header against consumer credentials each time the API is called; only requests with correct credentials (curl -i -u username:password ...) get the expected response from the OpenAI API.
Apply rate-limiting policies: protect the product-description endpoint from abuse with the limit-count plugin, limiting the number of API calls per time window per API consumer (keyed on the remote address, with a custom rejected message). The first requests in the window are handled as usual and return X-RateLimit-Limit and X-RateLimit-Remaining headers; once the count is exceeded, APISIX returns an HTTP 403 Forbidden code with the custom error message "Requests are too frequent, please try again later."
Configure caching for the OpenAI API response: Apache APISIX proxy caching serves cached responses to subsequent requests, reducing the number of OpenAI API requests (and therefore the usage cost), improving the performance of your integration, and reducing load on the API server, with fine-grained control over cache expiration, invalidation conditions, and other caching policies. Add the proxy-cache plugin alongside the others, caching only successful product-description responses from the POST method of the completion endpoint. The first request hits the route with Apisix-Cache-Status: MISS, meaning the response has not been cached yet; subsequent requests return the cached response with Apisix-Cache-Status: HIT.
Summary: Apache APISIX and OpenAI API integration combines the features of Apache APISIX, an open-source, high-performance API gateway, with the advanced AI capabilities of the OpenAI API. Developers can leverage APISIX's scalability and performance to manage microservices while using OpenAI's cutting-edge AI to deliver sophisticated features to their users; at later stages, both APISIX and the OpenAI runtime code can be deployed to an application server or any public cloud for production. The post demonstrated only a few of the Apache APISIX plugins that can be used with the OpenAI API; choose the plugins that best meet your needs and customize the integration to your applications' requirements. Related resources: OpenAI API. Recommended content: most common use cases of an API Gateway; How to choose the right API Gateway; Why is Apache APISIX the best API Gateway? Community: join the Apache APISIX community, follow us on Twitter, find us on Slack. 2023-02-07 14:11:10
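The demo's curl calls translate directly into any HTTP client. Here is a minimal Python sketch of the upstream-plus-route setup, assuming the Admin API listens on localhost:9180 (the APISIX 3.x default); the admin key, OpenAI key, and the /openai/product/desc gateway path are placeholders reconstructed from the post.

```python
# Sketch of the demo's Admin API calls (upstream, plugin_config, route)
# using requests. ADMIN_KEY and OPENAI_KEY are placeholders; the port
# and the gateway path are assumptions, not confirmed by the post.
import requests

ADMIN = "http://127.0.0.1:9180/apisix/admin"
HEADERS = {"X-API-KEY": "<your-admin-key>"}
OPENAI_KEY = "<your-openai-api-key>"

# 1. Upstream: api.openai.com over HTTPS, roundrobin with a single node.
requests.put(f"{ADMIN}/upstreams/1", headers=HEADERS, json={
    "name": "OpenAI API upstream",
    "type": "roundrobin",
    "scheme": "https",
    "nodes": {"api.openai.com:443": 1},
})

# 2. Plugin config: rewrite the gateway path to the completions endpoint
#    and inject the OpenAI auth header on every proxied request.
requests.put(f"{ADMIN}/plugin_configs/1", headers=HEADERS, json={
    "plugins": {
        "proxy-rewrite": {
            "uri": "/v1/completions",
            "host": "api.openai.com",
            "headers": {
                "Authorization": f"Bearer {OPENAI_KEY}",
                "Content-Type": "application/json",
            },
        },
    },
})

# 3. Route: expose POST on the custom gateway path and tie it together.
requests.put(f"{ADMIN}/routes/1", headers=HEADERS, json={
    "name": "OpenAI API completion route",
    "methods": ["POST"],
    "uri": "/openai/product/desc",
    "upstream_id": "1",
    "plugin_config_id": "1",
})
```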
Overseas TECH DEV Community How to show a pop-up only once with React, localStorage and Material UI modal https://dev.to/bcncodeschool/how-to-show-a-pop-up-only-once-with-react-localstorage-and-material-ui-modal-n81 Imagine a scenario where a user comes to a website and has to see a pop-up or modal with some information and click it before seeing the main content, for example agreeing to Terms and Conditions, reading an important warning, or merely seeing a call to action. To make it less annoying, we only want to show it once to a new user and hide it for returning users. This can easily be done with localStorage, where we keep the data about the user's click on the pop-up. It works as follows: a new user comes to the website and our component renders; we check whether localStorage already records that this user clicked the pop-up; if not, we set the component state displayPopUp to true, and once the user clicks to close the pop-up we set seenPopUp to true in localStorage and displayPopUp to false; if the record already exists, we set displayPopUp to false from the start.
First, in our React component we import useEffect and useState from react, and Modal and Box from @mui/material (remember to install it with npm i @mui/material). Then we declare the component: a state variable decides whether to render the pop-up (const [displayPopUp, setDisplayPopUp] = useState(true)), and a closePopUp function, passed to the modal's onClose property, runs localStorage.setItem('seenPopUp', true) and setDisplayPopUp(false) when the pop-up is closed. In the JSX, conditional rendering (displayPopUp && <Modal open={true} onClose={closePopUp} ...>) shows the modal, whose Box contains the message ("Very important message", "Just press OK button to never see it again") and an OK button wired to closePopUp; below it sits the main content of the page.
This successfully renders the page, displays the pop-up, closes it when the user clicks, and saves true under the seenPopUp key in the browser's localStorage. But the pop-up would still appear again on every refresh, so the last step is a useEffect that triggers on first render and checks whether localStorage already has the data: let returningUser = localStorage.getItem('seenPopUp') (null for a new user, true for a returning one), then setDisplayPopUp(!returningUser), i.e. false for a returning user and true for a new one. The final component combines all of the above, plus a custom style object (absolute positioning, a centering transform, border, box shadow, padding, centered text) passed to the Box via the sx prop. Et voilà! Remember that localStorage belongs to a specific browser: if a user opens this page in Safari, they will not see the pop-up next time they open the page in the same Safari, but opening the page in another browser, for example Google Chrome, shows the pop-up again. A link to the sandbox with the code is in the post. Cheers! 2023-02-07 14:11:10
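The same show-once idea (check a persisted flag, display, then set the flag) carries over anywhere you have a small key-value store. As a loose analogy only, here is a sketch of the pattern for a Python CLI that shows a first-run notice once, with a JSON file playing the role of the browser's localStorage; the file path and notice text are made-up examples.

```python
# Show-once pattern from the post, transplanted to a CLI: a JSON "store"
# stands in for localStorage, and "seenPopUp" is the same flag name.
# The store location and the notice text are hypothetical.
import json
from pathlib import Path

STORE = Path.home() / ".myapp-store.json"


def get_flag(key: str) -> bool:
    # Missing or corrupt store means "new user", like a null getItem().
    try:
        return bool(json.loads(STORE.read_text()).get(key))
    except (FileNotFoundError, json.JSONDecodeError):
        return False


def set_flag(key: str) -> None:
    # Equivalent of localStorage.setItem(key, true).
    try:
        data = json.loads(STORE.read_text())
    except (FileNotFoundError, json.JSONDecodeError):
        data = {}
    data[key] = True
    STORE.write_text(json.dumps(data))


if not get_flag("seenPopUp"):       # new user: show the notice once
    print("Very important message: press Enter to never see it again")
    input()
    set_flag("seenPopUp")           # returning runs skip straight ahead
print("The main content")
```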
Apple AppleInsider - Frontpage News Daily Deals Feb. 7: 2 $100 Restaurant.com eGift cards for $20, 50% off Bose Frames sunglasses & more https://appleinsider.com/articles/23/02/07/daily-deals-feb-7-2-100-restaurantcom-egift-cards-for-20-50-off-bose-frames-sunglasses-more?utm_medium=rss Some of the hottest deals we found today include a discounted Valentine's dinner for two, plus deals on EarPods, a Dell Latitude notebook, an iPhone Pro Max, a Vizio Smart TV, and an Apple Watch Series. The AppleInsider team searches the web for can't-miss deals at online retailers to create a list of deals on the top tech gadgets, including discounts on Apple products, TVs, accessories, and other items. We post the best finds in our Daily Deals list to help put more money in your pocket. Read more. 2023-02-07 14:58:58
Apple AppleInsider - Frontpage News Future HomeKit could track you through your house and predict your needs https://appleinsider.com/articles/23/02/07/future-homekit-could-track-you-through-your-house-and-predict-your-needs?utm_medium=rss A future version of HomeKit may keep track of where the people in the house are and learn user habits to figure out when to take actions automatically, without you having to ask Siri. The temperature and humidity sensors in the new HomePod, and now enabled in the HomePod mini, are designed to be used as part of automated systems. Right now you can create a Shortcut that says if the indoor temperature falls below a certain point, you want your heater turned on. That does require a heater you can control remotely, or perhaps just a smart outlet, plus the HomePod; but it also requires you to set up that Shortcut. "Using In-home Location Awareness", a newly granted patent, suggests that Apple wants to move away from that. Read more. 2023-02-07 14:41:54
Overseas TECH Engadget The next Nintendo Direct will take place on February 8th https://www.engadget.com/nintendo-direct-stream-date-switch-games-144620838.html?src=rss Nintendo has scheduled its first Direct of the year. It will largely focus on games coming to Switch in the first half of 2023. You'll be able to watch the Direct on February 8th at 5PM ET on the company's YouTube channel. What Nintendo will feature during the stream largely remains a mystery, but we can read the tea leaves and speculate a bit. While it's possible we'll get a deeper dive into The Legend of Zelda: Tears of the Kingdom before that game arrives in May, it'd be surprising if there were nothing about it at all in this Direct. Rumor has it that Nintendo may suddenly release Advance Wars 1+2: Re-Boot Camp this week after a lengthy delay related to Russia's invasion of Ukraine; if so, expect that to make an appearance. "Tune in at 2 p.m. PT tomorrow, Feb. 8, for a #NintendoDirect livestream featuring roughly 40 minutes of information mostly focused on #NintendoSwitch games launching in the first half of 2023. Watch it live here: pic.twitter.com/PmfdQWIw", Nintendo of America (@NintendoAmerica), February 7. With the Mario movie on the horizon, perhaps we'll learn something about the future of Nintendo's mascot on Switch. Super Nintendo World will open at Universal Studios Hollywood next week, so expect at least a mention of that. We may learn more about Pikmin 4 as well, while Nintendo has Bayonetta Origins: Cereza and the Lost Demon and Kirby's Return to Dream Land Deluxe lined up for this year. And then, of course, there's the ever-present expectation from fans that this, after years of patiently waiting, is when we'll finally get a release date for Hollow Knight: Silksong, which is supposed to arrive in the first half of this year. In any case, we won't have to wait too long to find out what Nintendo has up its sleeve. 2023-02-07 14:46:20
Cisco Cisco Blog How Internet Outages Impact Transportation https://blogs.cisco.com/transportation/how-internet-outages-impact-transportation How Internet Outages Impact TransportationThe Internet is a critical component of transportation and logistics With the vast amount of data available it is easy to see how the Internet can help companies in the transportation sector do better capacity planning for their fleets pick the best apps for planning and logistics and solve problems proactively with support from alerts and fault codes Most importantly though continuously connected systems can vastly improve passenger experience and safety which is what matters in the end 2023-02-07 14:30:31
Overseas TECH CodeProject Latest Articles Continuous Integration for Windows on Arm https://www.codeproject.com/Articles/5353967/Continuous-Integration-for-Windows-on-Arm armthis 2023-02-07 14:56:00
Overseas TECH WIRED The Antibiotic Resistance Crisis Has a Troubling Twist https://www.wired.com/story/the-antibiotic-resistance-crisis-has-a-troubling-twist/ degradation 2023-02-07 14:30:00
Finance RSS FILE - Japan Securities Dealers Association On the results of the solicitation of public comments https://www.jsda.or.jp/about/public/kekka/index.html solicitation 2023-02-07 15:00:00
Finance Financial Services Agency website An event on matching management talent with regional companies will be held https://www.fsa.go.jp/news/r4/ginkou/20230207/20230207.html management 2023-02-07 16:00:00
News BBC News - Home Epsom College deaths: Teacher and daughter shot by husband, police believe https://www.bbc.co.uk/news/uk-england-surrey-64544884?at_medium=RSS&at_campaign=KARANGA george 2023-02-07 14:32:37
News BBC News - Home David Carrick: Serial rapist Met Police officer jailed for at least 30 years https://www.bbc.co.uk/news/uk-england-beds-bucks-herts-64540800?at_medium=RSS&at_campaign=KARANGA advantage 2023-02-07 14:20:45
News BBC News - Home Kaylea Titford: Dad found guilty of daughter's manslaughter https://www.bbc.co.uk/news/uk-wales-64538142?at_medium=RSS&at_campaign=KARANGA kaylea 2023-02-07 14:55:15
News BBC News - Home Nicola Bulley: Nothing making sense in missing mum case, friend says https://www.bbc.co.uk/news/uk-england-lancashire-64549298?at_medium=RSS&at_campaign=KARANGA bulley 2023-02-07 14:42:23
News BBC News - Home Turkey earthquake: Three Britons missing, says Foreign Office https://www.bbc.co.uk/news/uk-64557448?at_medium=RSS&at_campaign=KARANGA gaziantep 2023-02-07 14:56:26
News BBC News - Home Turkey and Syria earthquake: 'The hospital was collapsing with my son inside' https://www.bbc.co.uk/news/world-middle-east-64538295?at_medium=RSS&at_campaign=KARANGA massive 2023-02-07 14:25:07
GCP Google Cloud Platform Japan Official Blog Vertex AI foundations for deploying secure and compliant ML/AI https://cloud.google.com/blog/ja/topics/developers-practitioners/vertex-ai-foundations-secure-and-compliant-mlai-deployment/ With Vertex AI Training, you can train custom models using Vertex AI managed datasets. 2023-02-07 15:20:00
GCP Cloud Blog JA Vertex AI foundations for deploying secure and compliant ML/AI https://cloud.google.com/blog/ja/topics/developers-practitioners/vertex-ai-foundations-secure-and-compliant-mlai-deployment/ With Vertex AI Training, you can train custom models using Vertex AI managed datasets. 2023-02-07 15:20:00
