Posted: 2023-05-28 21:12:40 RSS feed digest as of 2023-05-28 21:00 (16 items)

Category / Site / Article title or trend word / Link URL / Frequent words, summary, or search volume / Date added
TECH Techable(テッカブル) When do you use ChatGPT for work? The top answer: generating and proofreading emails and other text https://techable.jp/archives/208302 chatgpt 2023-05-28 11:00:27
python New posts tagged "Python" - Qiita Controlling a Mycobot from TouchDesigner (Python) https://qiita.com/tsunk66/items/16bae71d6c2bcd42cd8f mycobot 2023-05-28 20:49:47
js New posts tagged "JavaScript" - Qiita A university student takes stock of their own programming studies https://qiita.com/Flat2027/items/2171aa48993f013c81f0 htmlcss 2023-05-28 20:13:04
Docker New posts tagged "docker" - Qiita Better late than never! A Docker beginner works through the basics https://qiita.com/whiteeel/items/2c264c7fa7f45d091ff1 docker 2023-05-28 20:24:09
golang New posts tagged "Go" - Qiita A CLI for chatting away with ChatGPT from the terminal https://qiita.com/cdim/items/df8836c010f845f7c306 chatgpt 2023-05-28 20:11:58
Tech blog Developers.IO Automating the process of building the application, installing the dependencies, synchronizing with S3 bucket, and clearing CloudFront distribution cache https://dev.classmethod.jp/articles/automating-the-process-of-building-the-application-installing-the-dependencies-synchronizing-with-s3-bucket-and-clearing-cloudfront-distribution-cache/ Introduction: Hemanth of Alliance Department here. In this blog I tried automating the process of building the application, installing the dependencies, synchronizing with an S3 bucket, and clearing the CloudFront distribution cache. 2023-05-28 11:19:27
Overseas TECH MakeUseOf How to Migrate Your Old Minecraft Account Before It’s Too Late https://www.makeuseof.com/how-to-migrate-old-minecraft-account/ To Minecraft players with Mojang accounts: you need to migrate your account or risk losing all of your Minecraft progress. Here's how you can do that. 2023-05-28 11:37:29
Overseas TECH MakeUseOf What Is Sony’s Project Q? https://www.makeuseof.com/what-is-sonys-project-q/ project 2023-05-28 11:05:47
Overseas TECH DEV Community File Uploads Made Easy with Multer package in nodeJS https://dev.to/itsvinayak/file-uploads-made-easy-with-multer-package-in-nodejs-3mo0

File Uploads Made Easy with the Multer Package in Node.js

In today's world, file uploads have become a very common feature in web applications, and Node.js provides an easy way to handle them. With the help of third-party packages like multer and formidable, file uploads can be handled with ease. In this article we will explore how to handle file uploads in Node.js using Multer and build an API backend for uploading single and multiple files.

Multer

Multer provides an easy way to handle file uploads in Node.js. It offers many options to customize the upload process, including:

- file uploads with customizable file names and storage location
- limiting the file size
- filtering files based on their MIME types
- handling multiple files at once

If you are building a web application with file upload functionality, Multer is worth considering.

Requirements

- Express.js: set up an Express.js server to handle HTTP requests and build the API endpoints.
- Multer: install the Multer middleware package using NPM. Multer is a popular middleware that enables file uploads in Node.js.

Initializing the Node.js project

Set up the project directory. Create a new directory for this project on your local machine with the name fileShare:

  mkdir fileShare

Initialize the project. Open your terminal or command prompt, navigate to the project directory you created, and run the following command to initialize a new Node.js project:

  npm init

This command will prompt you for information about your project, such as the project name, version, description, entry point, etc. You can either fill in the details or press enter to accept the default values.

Install the required dependencies. Here we install express, multer, and nodemon for this project:

  npm install --save express
  npm install --save multer
  npm install --save-dev nodemon

Set up package.json:

  {
    "name": "fileshare",
    "description": "",
    "main": "index.js",
    "scripts": {
      "dev": "nodemon index.mjs",
      "test": "echo \"Error: no test specified\" && exit 1"
    },
    "author": "",
    "license": "ISC",
    "dependencies": { "dotenv": "...", "express": "...", "multer": "..." },
    "devDependencies": { "nodemon": "..." }
  }

Create an index.mjs file; it will contain our code for handling file uploads:

  touch index.mjs

Implementation

Setting up an Express server:

  import express from "express";
  import dotenv from "dotenv";

  const PORT = process.env.PORT;
  const env = process.env.NODE_ENV || "development";
  if (env === "development") {
    dotenv.config();
  } else {
    dotenv.config({ path: ".env.prod" });
  }

  const app = express();

  app.listen(PORT, () => console.log(`Server started on port ${PORT}`));

Create a folder to upload files into (added to the same index.mjs):

  import path from "path";
  import fs from "fs";

  const dirname = path.resolve();
  const uploadFolder = path.join(dirname, "uploads");

  // Create the folder if it does not exist
  fs.mkdirSync(uploadFolder, { recursive: true });

In the code above, fs.mkdirSync creates the folder if it does not already exist; it is created in the current directory.

Initializing Multer (added to the same index.mjs):

  import multer from "multer";

  const storage = multer.diskStorage({
    destination: function (req, file, cb) {
      return cb(null, uploadFolder);
    },
    filename: function (req, file, cb) {
      const origFileName = file.originalname.split(".")[0];
      const filename = origFileName + Date.now() + path.extname(file.originalname);
      return cb(null, filename);
    },
  });

  const upload = multer({ storage: storage });

Creating POST endpoints to handle single and multiple file uploads:

  app.post("/api/upload", upload.single("file"), (req, res) => {
    console.log("req.file", req.file);
    res.status(200).json({ status: "success", message: "File uploaded successfully" });
  });

  // upload multiple files
  app.post("/api/upload/multiple", upload.array("files"), (req, res) => {
    console.log("req.files", req.files);
    res.status(200).json({ files: req.files });
  });

In the code above we set up a storage engine for Multer to define where uploaded files will be stored, then set up the Multer middleware to handle the file uploads.

Create a GET endpoint to list all uploaded files:

  app.get("/", (req, res) => {
    const fileList = fs.readdirSync(uploadFolder);
    console.log("fileList", fileList);
    res.status(200).json(fileList);
  });

With this API backend in place, you can easily handle file uploads in your web application.

Testing the API endpoints

Uploading a single file using the curl command:

  curl --location --request POST 'http://localhost:<port>/api/upload' --form 'file=@"one.txt"'

Uploading multiple files using the curl command:

  curl --location --request POST 'http://localhost:<port>/api/upload/multiple' --form 'files=@"one.txt"' --form 'files=@"two.txt"'

Conclusion

Handling file uploads in Node.js has become very easy with the help of packages like Multer, which provides an easy way to handle uploads with many customization options. The complete source code for this project is readily available on GitHub. By accessing the repository you can explore the code, review its structure, and use it to deepen your understanding or to develop your own Node.js projects. Feel free to visit the GitHub repository to access the complete source code and leverage it for your needs. Link 2023-05-28 11:34:28
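As a language-neutral illustration of what Multer receives on the server, here is a sketch, using only Python's standard library, of building the multipart/form-data request that the curl commands above produce. The endpoint path /api/upload and the field name file follow the article; the port, boundary string, and the helper name build_multipart are invented for illustration.

```python
import io
import urllib.request

def build_multipart(field: str, filename: str, payload: bytes, boundary: str) -> bytes:
    """Assemble a minimal multipart/form-data body by hand."""
    body = io.BytesIO()
    body.write(f"--{boundary}\r\n".encode())
    body.write(
        f'Content-Disposition: form-data; name="{field}"; filename="{filename}"\r\n'.encode()
    )
    body.write(b"Content-Type: application/octet-stream\r\n\r\n")
    body.write(payload)
    body.write(f"\r\n--{boundary}--\r\n".encode())
    return body.getvalue()

boundary = "pyboundary123"  # illustrative; real clients generate a random boundary
body = build_multipart("file", "one.txt", b"hello", boundary)

# The request Multer's upload.single("file") would parse (port is illustrative;
# constructing the Request sends nothing until urlopen is called):
req = urllib.request.Request(
    "http://localhost:3000/api/upload",
    data=body,
    headers={"Content-Type": f"multipart/form-data; boundary={boundary}"},
    method="POST",
)
```

Multer reads the boundary from the Content-Type header, splits the body on it, and hands each part to the configured storage engine, which is why the field name in the form part must match the name given to upload.single().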
Overseas TECH DEV Community Getting Started: Monitoring a FastAPI App with Grafana and Prometheus - A Step-by-Step Guide https://dev.to/ken_mwaura1/getting-started-monitoring-a-fastapi-app-with-grafana-and-prometheus-a-step-by-step-guide-3fbn

Getting Started: Monitoring a FastAPI App with Grafana and Prometheus - A Step-by-Step Guide

Introduction

Monitoring plays a crucial role in ensuring the performance, availability, and stability of FastAPI applications. By closely tracking key metrics and identifying potential issues, developers can proactively address them and deliver a better user experience. In this guide we will explore how to set up monitoring for a FastAPI app using two powerful tools: Grafana and Prometheus.

What is Prometheus? Prometheus is an open-source monitoring system that collects metrics from your application and stores them in a time-series database. It can be used to monitor the performance of your application and alert you when something goes wrong.

What is Grafana? Grafana is an open-source visualization tool that can be used to create dashboards showing the status of your application.

Overview of monitoring FastAPI apps: Monitoring is an important part of any application. It helps you understand how your application is performing and how it is being used, and it helps you identify and fix issues before they become a problem. Many monitoring tools are available, each with its own pros and cons; in this guide we use Prometheus and Grafana.

Importance of Grafana and Prometheus in monitoring: Grafana is a tool for visualizing data and creating dashboards that show the status of your application. Prometheus is a tool for collecting metrics from your application, such as CPU usage, memory usage, and network traffic.

Prerequisites

- Docker and Docker Compose
- Python and pip
- Terminal or command prompt
- Text editor or IDE (VS Code, PyCharm, etc.)
- Basic knowledge of FastAPI, Docker, and Python
- Basic knowledge of Prometheus and Grafana
- Basic knowledge of Docker and Docker Compose

Project setup

To keep things simple, we'll use an existing FastAPI app for this guide. You can clone the repo, or use your own FastAPI app if you prefer:

  git clone <repo URL>

Once you have cloned the repo, run the following commands to create a virtualenv and install the dependencies:

  cd Fast-Api-example
  python -m venv venv
  source venv/bin/activate
  cd src
  pip install -r requirements.txt

To run the app:

  uvicorn app.main:app --reload --workers <n> --host <host> --port <port>

The command above starts the app on the given port; you can access it at http://localhost:<port>/docs in your browser. Feel free to change the command to suit your needs.

Setting up Prometheus

A. Installation and configuration of Prometheus with Docker

Install Docker on your system if it is not already installed, then pull the Prometheus Docker image from the official repository:

  docker pull prom/prometheus

Create a folder prometheus_data and, inside it, a configuration file named prometheus.yml to define Prometheus settings and targets. Example configuration:

  global:
    scrape_interval: <n>s
  scrape_configs:
    - job_name: fastapi-app
      static_configs:
        - targets: ['web:<port>']

This configuration specifies the scrape interval and sets the FastAPI app as the target to be monitored. Start the Prometheus container:

  docker run -p <host port>:<container port> -v /path/to/prometheus.yml:/etc/prometheus/prometheus.yml prom/prometheus

Replace /path/to/prometheus.yml with the actual path to your prometheus.yml configuration file. Access Prometheus by navigating to http://localhost:<port> in your web browser; you should see the Prometheus web interface.

B. Instrumenting the FastAPI app for Prometheus metrics

Create a virtual environment for the FastAPI app and activate it, then install the required Python libraries, including the Prometheus integration:

  python -m venv venv
  source venv/bin/activate
  cd src
  pip install -r requirements.txt
  pip install prometheus-fastapi-instrumentator

In your FastAPI app's main file (in this case src/app/main.py), import the Instrumentator class from prometheus_fastapi_instrumentator:

  from prometheus_fastapi_instrumentator import Instrumentator

Initialize and instrument your FastAPI app with the Instrumentator:

  Instrumentator().instrument(app).expose(app)

This step automatically adds Prometheus metrics instrumentation to your FastAPI app and exposes the metrics endpoint. Restart your FastAPI app to apply the instrumentation changes. If everything is successful, there should be a new endpoint at http://localhost:<port>/metrics that returns the Prometheus metrics. With Prometheus installed and configured in a Docker container and your FastAPI app instrumented with Prometheus metrics, you are ready to move on to integrating Grafana for visualization and analysis.

Connecting Prometheus and Grafana

To connect Prometheus and Grafana we use the Prometheus data source plugin for Grafana. This plugin allows Grafana to query Prometheus and create dashboards that show the status of your application. You can download the plugin from the Grafana website (in Docker the plugin is already installed) and install it by running:

  grafana-cli plugins install grafana-prometheus-datasource

Running the app with Docker Compose

Now that the app runs, we can use Docker Compose to run it together with Prometheus and Grafana, using the official Prometheus and Grafana images from Docker Hub. The Compose file contains the following services: the FastAPI app as the web service, Prometheus as the prometheus service, Grafana as the grafana service, and Postgres as the database service. It also contains the Prometheus and Grafana data volumes and the hello_fastapi network.

Create a file named docker-compose.yml in the root directory of your project and add the following code:

  version: '<x>'
  services:
    web:
      build: ./src
      command: uvicorn app.main:app --reload --workers <n> --host <host> --port <port>
      volumes:
        - ./src:/usr/src/app
      ports:
        - '<host port>:<port>'
      environment:
        - DATABASE_URL=postgresql://hello_fastapi:hello_fastapi@db/hello_fastapi_dev
      depends_on:
        - db
    db:
      image: postgres:alpine
      volumes:
        - postgres_data:/var/lib/postgresql/data
      environment:
        - POSTGRES_USER=hello_fastapi
        - POSTGRES_PASSWORD=hello_fastapi
        - POSTGRES_DB=hello_fastapi_dev
      ports:
        - '<host port>:<port>'
    prometheus:
      image: prom/prometheus
      container_name: prometheus
      ports:
        - '<host port>:<port>'
      volumes:
        - ./prometheus_data/prometheus.yml:/etc/prometheus/prometheus.yml
      command: --config.file=/etc/prometheus/prometheus.yml
    grafana:
      image: grafana/grafana
      container_name: grafana
      ports:
        - '<host port>:<port>'
      volumes:
        - grafana_data:/var/lib/grafana
  volumes:
    prometheus_data:
      driver: local
      driver_opts:
        o: bind
        type: none
        device: ./prometheus_data
    grafana_data:
      driver: local
      driver_opts:
        o: bind
        type: none
        device: ./grafana_data
    postgres_data:
  networks:
    default:
      name: hello_fastapi

Let's go through the code above and see what each part does:

- version: the version of the Docker Compose file format we are using; you can find more information in the Docker Compose file format documentation.
- services: the services we want to run with Docker Compose. Each service runs in its own container; for more information, read the Docker Compose documentation.
- prometheus: runs the official Prometheus image from Docker Hub. container_name: prometheus names the container so it can be referred to elsewhere in the Compose file. Its ports section exposes the port used to access the Prometheus web interface. Its volumes section mounts the Prometheus configuration file into the container, and its command (--config.file=/etc/prometheus/prometheus.yml) specifies the location of that configuration file. The network connects the prometheus service to the other services.
- grafana: runs the official Grafana image from Docker Hub, with container_name: grafana. Its ports section exposes the port used to access the Grafana web interface, and its grafana_data volume stores the Grafana data. The network connects the grafana service to the other services.
- volumes: the named volumes used by the Compose file, for example prometheus_data, which stores the Prometheus data.

Prometheus configuration file

As mentioned earlier, Prometheus needs a configuration file to know what to monitor. We start from the default configuration file that ships with Prometheus, with some changes to make it work with our app. Update prometheus_data/prometheus.yml with the following code:

  # config file for prometheus
  global:
    scrape_interval: <n>s
    scrape_timeout: <n>s
    evaluation_interval: <n>s
  alerting:
    alertmanagers:
      - follow_redirects: true
        enable_http2: true
        scheme: http
        timeout: <n>s
        api_version: v2
        static_configs:
          - targets: []
  scrape_configs:
    - job_name: prometheus
      honor_timestamps: true
      scrape_interval: <n>s
      scrape_timeout: <n>s
      metrics_path: /metrics
      scheme: http
      follow_redirects: true
      enable_http2: true
      static_configs:
        - targets:
            - localhost:<port>
    - job_name: fastapi
      scrape_interval: <n>s
      metrics_path: /metrics
      static_configs:
        - targets:
            - web:<port>

Let's go through the code above and see what each part does:

- global: the global settings for Prometheus. scrape_interval tells Prometheus how often to scrape the targets; scrape_timeout tells it how long to wait for a scrape to complete before timing out; evaluation_interval tells it how often to evaluate the rules.
- alerting / alertmanagers: the alerting settings. follow_redirects tells Prometheus to follow redirects when sending alerts to the Alertmanager; enable_http2 enables HTTP/2; scheme: http uses HTTP; timeout is how long to wait for a response from the Alertmanager before timing out; api_version selects the Alertmanager API version; static_configs / targets points Prometheus at the default Alertmanager.
- scrape_configs: the scrape jobs. The job named prometheus scrapes Prometheus itself: honor_timestamps tells Prometheus to honor timestamps when scraping, and scrape_interval, scrape_timeout, metrics_path, scheme, follow_redirects, and enable_http2 configure the scrape; its target is localhost. The job named fastapi scrapes our app at the /metrics path on its own interval; its target is web, because that is the name of the service we defined in the Docker Compose file.

If you want to learn more about the Prometheus configuration file, you can read the Prometheus documentation.

Running the app

Now that we have the Docker Compose file and the Prometheus configuration file, we can run the app:

  docker compose up -d

Verify that Prometheus is scraping the metrics from your FastAPI app by visiting http://localhost:<port>/targets in your web browser; the FastAPI app target should be listed in the UP state.

Grafana dashboard

Now that Prometheus is running, we can create a Grafana dashboard to visualize the metrics from our FastAPI app. To create one, we need to:

1. Create a new Grafana dashboard.
2. Add a new Prometheus data source.
3. Add a new graph panel.
4. Add a new query to the graph panel.
5. Apply the changes to the graph panel.
6. Save the dashboard.
7. View the dashboard.
8. Repeat these steps for each metric you want to visualize, each dashboard you want to create, and each app you want to monitor.

Once you have Grafana running, go to localhost:<port>. Enter the default username and password (admin/admin) and click Log In. You will be prompted to change the password; enter a new one and click Save. Click the "Create your first data source" button, then the Prometheus button, and enter the following information:

  Name: Prometheus
  URL: http://prometheus:<port>
  Access: Server (Default)
  Scrape interval: <n>s
  HTTP Method: GET
  HTTP Auth: None / Basic Auth: None / With Credentials: No
  TLS Client Auth: None / TLS CA Certificate: None

Click Save & Test, then Dashboards, then New Dashboard, then Add Visualization. Here you can select the type of visualization to add to the dashboard; for this example we select Time Series. Now let's add a query to the graph. Click the Query button; Grafana provides a query builder we can use to select the metrics we want to visualize. Click the Metrics button; we'll use api_request_duration_seconds_count as the metric. Click the label filters button, select endpoint and from there the notes/{id} endpoint, then click Add Filter and select http_status. Now click Run Query; you should see the time-series graph for the api_request_duration_seconds_count metric. Enter the panel title api_request_duration_seconds_count and click Apply, then Save. Rinse and repeat for each metric you want to visualize; you can also add multiple graphs to the same dashboard. Feel free to use my Grafana dashboard as a starting point; find the JSON file in the GitHub repo.

Conclusion

In this article we learned how to monitor a FastAPI app using Prometheus and Grafana: how to create a Docker Compose file to run Prometheus and Grafana, how to create a Prometheus configuration file to scrape the metrics from our FastAPI app, and how to create a Grafana dashboard to visualize those metrics. Thanks for reading! Feel free to leave a comment below if you have any questions or suggestions. You can also reach out to me on Twitter. If you found this article helpful, feel free to share it with others.

References: FastAPI; Prometheus; Grafana; Docker; Docker Compose; Docker Compose file reference; Docker Compose networking; Docker Compose environment variables; Docker Compose volumes 2023-05-28 11:33:18
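The /metrics endpoint that the instrumentator exposes serves plain-text Prometheus exposition format, which is what both the Prometheus scraper and the Grafana queries above ultimately consume. As a rough stdlib-only sketch of that format and how a scrape could be inspected (the sample payload, its HELP text, and its label names are invented for illustration, not captured from the app):

```python
def parse_metrics(text: str) -> dict[str, float]:
    """Parse Prometheus exposition-format lines into {metric-with-labels: value},
    skipping HELP/TYPE comments and blank lines."""
    samples = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # comment or blank line
        # The value is the last space-separated token; everything before it
        # is the metric name plus its label set.
        name, _, value = line.rpartition(" ")
        samples[name] = float(value)
    return samples

# Invented sample shaped like a /metrics scrape (label names are illustrative)
sample = """\
# HELP api_request_duration_seconds_count Count of observed request durations.
# TYPE api_request_duration_seconds_count counter
api_request_duration_seconds_count{endpoint="/notes/{id}",http_status="2xx"} 17.0
api_request_duration_seconds_count{endpoint="/docs",http_status="2xx"} 4.0
"""

metrics = parse_metrics(sample)
```

This is only a sketch for eyeballing a scrape; a real consumer would use Prometheus itself or a client library, since the full format also carries timestamps and escaped label values that this parser ignores.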
Apple AppleInsider - Frontpage News Crime blotter: Guilty verdict in North Carolina iPhone fraud trial https://appleinsider.com/articles/23/05/28/crime-blotter-guilty-verdict-in-north-carolina-iphone-fraud-trial?utm_medium=rss In the latest Apple Crime Blotter, a court employee is convicted of illegally wiping an iPad, a MacBook theft leads to a chase, and a robber steals an Apple Watch along with an ATM. The Apple Store at Willow Grove Mall in Pennsylvania. The latest in an occasional AppleInsider series looking at the world of Apple-related crime. Read more 2023-05-28 11:10:41
News BBC News - Home Ukraine war: Kyiv hit by new massive Russian drone attack https://www.bbc.co.uk/news/world-65736730?at_medium=RSS&at_campaign=KARANGA russia 2023-05-28 11:11:17
News BBC News - Home Steve Barclay admits some new hospitals won't be brand new https://www.bbc.co.uk/news/uk-politics-65737681?at_medium=RSS&at_campaign=KARANGA hospitals 2023-05-28 11:52:49
News BBC News - Home Two men die after being pulled from sea in Devon https://www.bbc.co.uk/news/uk-england-65739081?at_medium=RSS&at_campaign=KARANGA devonrescue 2023-05-28 11:46:22
News BBC News - Home Kostyuk booed after avoiding Sabalenka handshake https://www.bbc.co.uk/sport/tennis/65738855?at_medium=RSS&at_campaign=KARANGA french 2023-05-28 11:20:08
News Newsweek The 18-year-old daughter of a prominent US politician unexpectedly becomes a Playboy model, showing off bold bikini and dress looks https://www.newsweekjapan.jp/stories/world/2023/05/18-35.php Claudia criticized her mother, but in an interview with the US online magazine バサル she acknowledged that she did not believe her mother had posted the images intentionally, explaining that they were pictures her mother found while going through her phone. 2023-05-28 20:10:00
