Posted 2022-07-25 19:46:56. RSS feed digest as of 2022-07-25 19:00 (58 items)

Category | Site | Article title or trending keyword | Link URL | Frequent words, summary, or search volume | Date registered
IT @IT Master of IP Network forum (latest articles) What will change with 5G Advanced and 6G? The next focus is the enterprise market https://atmarkit.itmedia.co.jp/ait/articles/2207/25/news144.html abiresearch 2022-07-25 18:15:00
IT @IT all forums (latest articles) What will change with 5G Advanced and 6G? The next focus is the enterprise market https://atmarkit.itmedia.co.jp/ait/articles/2207/25/news144.html abiresearch 2022-07-25 18:15:00
IT ITmedia all articles [ITmedia Business Online] First-half 2022 business book ranking: No. 2 is 『1日1話、読めば心が熱くなる365人の生き方の教科書』; what took No. 1? https://www.itmedia.co.jp/business/articles/2207/25/news160.html itmedia 2022-07-25 18:45:00
IT ITmedia all articles [ITmedia News] Slugs in an Osaka Ohsho refrigerator? A self-described employee blows the whistle on Twitter; the company responds with a statement https://www.itmedia.co.jp/news/articles/2207/25/news180.html itmedia 2022-07-25 18:28:00
python New posts tagged "Python" - Qiita Part 4 | Explaining the syntax-analysis method and deliverables https://qiita.com/venect_qiita/items/71409bcef2caac9f2843 positioning 2022-07-25 18:59:28
python New posts tagged "Python" - Qiita Part 3 | Explaining the sentiment-analysis method and deliverables https://qiita.com/venect_qiita/items/6e44848bd2cf3a25c250 corporation 2022-07-25 18:58:20
python New posts tagged "Python" - Qiita Part 2 | Explaining the entity-analysis method and deliverables https://qiita.com/venect_qiita/items/2b036a95ab4a4b30d53e corporation 2022-07-25 18:57:03
python New posts tagged "Python" - Qiita Part 1 | Overview and uses of GCP's Cloud Natural Language https://qiita.com/venect_qiita/items/1fdf5b8377f79f9a9910 cloudnatural 2022-07-25 18:54:15
AWS New posts tagged "AWS" - Qiita How I passed the AWS SAA exam https://qiita.com/Naga_Ayuu/items/d7fbfa4b18745fcbc491 sarchitectassociate 2022-07-25 18:19:39
GCP New posts tagged "gcp" - Qiita Part 4 | Explaining the syntax-analysis method and deliverables https://qiita.com/venect_qiita/items/71409bcef2caac9f2843 positioning 2022-07-25 18:59:28
GCP New posts tagged "gcp" - Qiita Part 3 | Explaining the sentiment-analysis method and deliverables https://qiita.com/venect_qiita/items/6e44848bd2cf3a25c250 corporation 2022-07-25 18:58:20
GCP New posts tagged "gcp" - Qiita Part 2 | Explaining the entity-analysis method and deliverables https://qiita.com/venect_qiita/items/2b036a95ab4a4b30d53e corporation 2022-07-25 18:57:03
GCP New posts tagged "gcp" - Qiita Part 1 | Overview and uses of GCP's Cloud Natural Language https://qiita.com/venect_qiita/items/1fdf5b8377f79f9a9910 cloudnatural 2022-07-25 18:54:15
Tech blog Developers.IO Countering password-spray attacks through the lens of MITRE ATT&CK https://dev.classmethod.jp/articles/mitre-attck_password-spray/ mitreattampck 2022-07-25 09:35:34
Tech blog Developers.IO Migrating .NET applications to AWS #devio2022 https://dev.classmethod.jp/articles/devio2022-aws-migration-of-dotnet-applications/ developersio 2022-07-25 09:14:35
Tech blog Developers.IO [Starting soon] Opens Tuesday, July 26, with a viewer-participation giveaway: "DevelopersIO 2022: three days of tech that stirs the heart" https://dev.classmethod.jp/news/devio-2022/ developersio 2022-07-25 09:00:46
Overseas TECH DEV Community Stateless, Secretless Multi-cluster Monitoring in Azure Kubernetes Service with Thanos, Prometheus and Azure Managed Grafana https://dev.to/ams0/stateless-secretless-multi-cluster-monitoring-in-azure-kubernetes-service-with-thanos-prometheus-and-azure-managed-grafana-37jg

Introduction

Observability is paramount to every distributed system, and it is becoming increasingly complicated in a cloud-native world where we might deploy multiple ephemeral clusters and want to keep their metrics beyond their lifecycle span. This article is aimed at cloud-native engineers who face the challenge of observing multiple Azure Kubernetes Service (AKS) clusters and need a flexible, stateless solution that leverages available and cost-effective blob storage for long-term retention of metrics, one which does not require injecting static secrets to access the storage because it uses the native Azure Managed Identities associated with the cluster. The solution builds upon well-established Cloud Native Computing Foundation (CNCF) open-source projects like Thanos and Prometheus, together with a new managed service, Azure Managed Grafana, recently released in public preview. It allows ephemeral clusters to still have up-to-date metrics without the hours of local metric storage required in the classic deployment of the Thanos sidecar next to Prometheus. This article was inspired by several sources, most importantly these two articles on the Microsoft Tech Community blog: "Using Azure Kubernetes Service with Grafana and Prometheus" and "Store Prometheus Metrics with Thanos, Azure Storage and Azure Kubernetes Service".

Prerequisites

- An AKS cluster with either a user-assigned managed identity assigned to the kubelet identity, or a system-assigned identity
- The ability to assign roles on Azure resources (User Access Administrator role)
- A storage account
- (Recommended) A public DNS zone in Azure
- Azure CLI
- Helm CLI

Architecture

We will deploy all components of Thanos and Prometheus in a single cluster, but since they are coupled only via the ingress, they do not need to be co-located.

Cluster-wide services

For the Thanos receive and query components to be available outside the cluster and secured with TLS, we need ingress-nginx and cert-manager. For ingress, deploy the Helm chart using the following command, which accounts for a known health-probe issue with AKS clusters:

```shell
helm upgrade --install ingress-nginx ingress-nginx \
  --repo https://kubernetes.github.io/ingress-nginx \
  --set controller.service.annotations."service\.beta\.kubernetes\.io/azure-load-balancer-health-probe-request-path"=/healthz \
  --set controller.service.externalTrafficPolicy=Local \
  --namespace ingress-nginx --create-namespace
```

Notice the extra annotation and the externalTrafficPolicy set to Local. Next, we need cert-manager to automatically provision SSL certificates from Let's Encrypt; we just need a valid email address for the ClusterIssuer:

```shell
helm upgrade -i cert-manager \
  --namespace cert-manager --create-namespace \
  --set installCRDs=true \
  --set ingressShim.defaultIssuerName=letsencrypt-prod \
  --set ingressShim.defaultIssuerKind=ClusterIssuer \
  --repo https://charts.jetstack.io cert-manager

kubectl apply -f - <<EOF
apiVersion: cert-manager.io/v1
kind: ClusterIssuer
metadata:
  name: letsencrypt-prod
spec:
  acme:
    email: email@email.com
    server: https://acme-v02.api.letsencrypt.org/directory
    privateKeySecretRef:
      name: letsencrypt-prod
    solvers:
      - http01:
          ingress:
            class: nginx
EOF
```

Last but not least, we add a DNS record for our ingress load-balancer IP, so it is seamless to get public FQDNs for the Thanos receive and Thanos query endpoints:

```shell
az network dns record-set a add-record -n "*.thanos" -g dns -z cookingwithazure.com \
  --ipv4-address "$(kubectl get svc ingress-nginx-controller -n ingress-nginx \
    -o jsonpath='{.status.loadBalancer.ingress[0].ip}')"
```

Note how we use kubectl with jsonpath output to get the ingress public IP. We can now use the wildcard FQDN *.thanos.cookingwithazure.com in our ingresses, and cert-manager will be able to obtain the relevant certificates seamlessly.

Storage account preparation

Because we do not want to store any secret or service principal in-cluster, we leverage the Managed Identities assigned to the cluster and assign the relevant Azure roles on the storage account. Once you have created or identified the storage account to use, and created a container within it to store the Thanos metrics, assign the roles using the Azure CLI. First, determine the clientID of the managed identity:

```shell
clientid=$(az aks show -g <rg> -n <cluster-name> -o tsv \
  --query identityProfile.kubeletidentity.clientId)
```

Now assign the Reader and Data Access role on the storage account (you need this so the cloud controller can generate access keys for the containers) and the Storage Blob Data Contributor role on the container only. There is no need to grant the latter at the storage-account level, because that would enable writing to every container, which we do not need. Always remember to apply the principle of least privilege.

```shell
az role assignment create --role "Reader and Data Access" --assignee "$clientid" \
  --scope "/subscriptions/<subID>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account-name>"

az role assignment create --role "Storage Blob Data Contributor" --assignee "$clientid" \
  --scope "/subscriptions/<subID>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account-name>/blobServices/default/containers/<container-name>"
```

Create basic auth credentials

OK, we kinda cheated in the title: you do need at least one credential for this setup, the one used to access the Prometheus API exposed by Thanos from Azure Managed Grafana. We will use the same credentials (but feel free to generate a different one) to push metrics from Prometheus to Thanos using remote write via the ingress controller. You'll need a strong password stored locally in a file called `pass`:

```shell
htpasswd -c -i auth thanos < pass
```

Create the namespaces and secrets:

```shell
kubectl create ns thanos
kubectl create ns prometheus
# for Thanos Query and Receive
kubectl create secret generic -n thanos basic-auth --from-file=auth
# for Prometheus remote write
kubectl create secret generic -n prometheus remotewrite-secret \
  --from-literal=user=thanos --from-literal=password="$(cat pass)"
```

We now have the secrets in place for the ingresses and for deploying Prometheus.

Deploying Thanos

We will use the Bitnami chart to deploy the Thanos components we need:

```shell
helm upgrade -i thanos -n monitoring --create-namespace \
  --values thanos-values.yaml bitnami/thanos
```

Let's go through the relevant sections of the values file:

```yaml
objstoreConfig:
  type: AZURE
  config:
    storage_account: thanostore
    container: thanostore
    endpoint: blob.core.windows.net
    max_retries: <n>
    user_assigned_id: <kubelet identity clientID>
```

Replace user_assigned_id with the object ID of your kubeletIdentity; for more information about AKS identities, check out this article. This section instructs the Thanos Store Gateway and Compactor to use an Azure blob store and to use the kubelet identity to access it. Next, we enable the ruler and the query components:

```yaml
ruler:
  enabled: true
query:
  enabled: true
queryFrontend:
  enabled: true
```

We also enable autoscaling for the stateless query components, the query and the query-frontend (the latter helps aggregate read queries), and we enable simple authentication for the Query Frontend service using ingress-nginx annotations:

```yaml
queryFrontend:
  ingress:
    enabled: true
    annotations:
      cert-manager.io/cluster-issuer: letsencrypt-prod
      nginx.ingress.kubernetes.io/auth-type: basic
      nginx.ingress.kubernetes.io/auth-secret: basic-auth
      nginx.ingress.kubernetes.io/auth-realm: Authentication Required - thanos
    hostname: query.thanos.cookingwithazure.com
    ingressClassName: nginx
    tls: true
```

The annotation references the basic-auth secret we created before from the htpasswd credentials. Note that the same annotations also appear under the receive section, as we are using the exact same secret for pushing metrics into Thanos, although with a different hostname.

Prometheus remote write

Until full support for Agent mode lands in the Prometheus Operator (follow the tracking issue), we can use the remote-write feature to ship every metric instantly to a remote endpoint, in our case represented by the Thanos Query Frontend ingress. Let's start by deploying Prometheus using the kube-prometheus-stack Helm chart:

```shell
helm upgrade -i -n prometheus promremotewrite \
  -f prom-remotewrite.yaml prometheus-community/kube-prometheus-stack
```

Let's go through the values file to explain the options we need to enable remote write:

```yaml
prometheus:
  enabled: true
  prometheusSpec:
    externalLabels:
      datacenter: westeu
      cluster: playground
```

This enables Prometheus and attaches two extra labels to every metric, so it becomes easier to filter data coming from multiple source clusters later in Grafana.

```yaml
    remoteWrite:
      - url: <thanos receive ingress URL>
        name: Thanos
        basicAuth:
          username:
            name: remotewrite-secret
            key: user
          password:
            name: remotewrite-secret
            key: password
```

This section points to the remote endpoint, secured via SSL using Let's Encrypt certificates and thus trusted by the certificate store on the AKS nodes (if you use a non-trusted certificate, refer to the TLSConfig section of the PrometheusSpec API). Note how the credentials to access the remote endpoint come from the secret created beforehand and stored in the prometheus namespace. Also note that although Prometheus is deployed in the same cluster as Thanos for simplicity, it sends the metrics to the ingress FQDN, so it is trivial to extend this setup to multiple remote clusters and collect their metrics into a single centralized Thanos receive collector and a single blob storage, with all metrics correctly tagged and identifiable.

Observing the stack with Azure Managed Grafana

Azure Managed Grafana (AMG) is a new offering in Azure's set of observability tools, based on the popular open-source dashboarding system Grafana. Besides out-of-the-box integration with Azure, AMG is a fully functional Grafana deployment that can be used to monitor and graph different sources, including Thanos and Prometheus. To start, head to the Azure Portal and deploy AMG, then get the endpoint from the Overview tab and connect to your AMG instance. Add a new source of type Prometheus with the basic authentication we created before. Congratulations! We can now visualize the data flowing from Prometheus; we only need a dashboard to properly display it. Go to Dashboards > Browse in the left-side navigation bar, click Import, and import the "Kubernetes Views / Global" dashboard by its ID into your Grafana; you will then be able to see the metrics from the cluster. The imported dashboard has no filter for cluster or region, so it shows all cluster metrics aggregated. We will show in a future post how to add a variable to a Grafana dashboard to properly select and filter cluster views.

Future work

This setup allows for autoscaling of the receiver and query-frontend, as horizontal pod autoscalers are deployed and associated with the Thanos components. For even greater scalability and metrics isolation, Thanos can be deployed multiple times, each instance associated with a different storage account and a different ingress, separating the metrics at the source. These then appear as separate sources in Grafana, which can be displayed in the same dashboard by selecting the appropriate source for each graph and query. 2022-07-25 09:15:52
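The two role-assignment scopes in the Thanos article above are long and easy to get wrong. As a sanity check, here is a minimal shell sketch that builds and prints both scope strings before they are passed to `az role assignment create`; every identifier (subscription, resource group, account, container) is a placeholder, and the trailing `az` call is shown commented out since it requires a live subscription:

```shell
# Placeholder identifiers: substitute your own subscription, resource group,
# storage account, and container names.
SUB_ID="00000000-0000-0000-0000-000000000000"
RG="my-rg"
ACCOUNT="thanostore"
CONTAINER="thanostore"

# Scope for "Reader and Data Access": the whole storage account.
ACCOUNT_SCOPE="/subscriptions/${SUB_ID}/resourceGroups/${RG}/providers/Microsoft.Storage/storageAccounts/${ACCOUNT}"

# Scope for "Storage Blob Data Contributor": the single container only,
# following the least-privilege note in the article.
CONTAINER_SCOPE="${ACCOUNT_SCOPE}/blobServices/default/containers/${CONTAINER}"

echo "$ACCOUNT_SCOPE"
echo "$CONTAINER_SCOPE"

# Then, with a real clientid:
# az role assignment create --role "Storage Blob Data Contributor" \
#   --assignee "$clientid" --scope "$CONTAINER_SCOPE"
```

Printing the scopes first makes it obvious which role lands on the account and which lands on the single container.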
Overseas TECH DEV Community Create a GraphQL-powered project management endpoint in Rust and MongoDB - Actix web version https://dev.to/hackmamba/create-a-graphql-powered-project-management-endpoint-in-rust-and-mongodb-actix-web-version-3j1

GraphQL is a query language for reading and manipulating data for APIs. It prioritizes giving clients or servers the exact data they require by providing a flexible and intuitive syntax to describe such data. Compared to a traditional REST API, GraphQL provides a type system to describe schemas for data and, in turn, gives consumers of the API the affordance to explore and request the needed data using a single endpoint.

This post will discuss building a project management application with Rust using the async-graphql library and MongoDB. At the end of this tutorial, we will have learned how to create a GraphQL endpoint that supports reading and manipulating project-management data and persists that data in MongoDB. The GitHub repository can be found here.

Prerequisites

To fully grasp the concepts presented in this tutorial, experience with Rust is required. Experience with MongoDB isn't a requirement, but it's nice to have. We will also need:

- Basic knowledge of GraphQL
- A MongoDB account to host the database (signup is completely free)

Let's code!

Getting started

Navigate to the desired directory and run the command below in a terminal:

```shell
cargo new project-mngt-rust-graphql-actix && cd project-mngt-rust-graphql-actix
```

This command creates a Rust project called project-mngt-rust-graphql-actix and navigates into the project directory. Next, install the required dependencies by modifying the [dependencies] section of Cargo.toml as shown below:

```toml
# other code sections go here
[dependencies]
actix-web = "<version>"
async-graphql = { version = "<version>", features = ["bson", "chrono"] }
async-graphql-actix-web = "<version>"
serde = "<version>"
dotenv = "<version>"
futures = "<version>"

[dependencies.mongodb]
version = "<version>"
```

- actix-web is a Rust-based framework for building web applications.
- async-graphql is a server-side library for building GraphQL services in Rust; we enable its bson and chrono features.
- async-graphql-actix-web is a library that integrates async-graphql with actix-web.
- serde is a framework for serializing and deserializing Rust data structures, e.g. converting Rust structs to JSON.
- dotenv is a library for managing environment variables.
- futures is a library for asynchronous programming in Rust.
- mongodb is a driver for connecting to MongoDB; it also specifies the required version.

Run the command below to install the dependencies:

```shell
cargo build
```

Module system in Rust

Modules are like folder structures in our application; they simplify how we manage dependencies. Navigate to the src folder and create the config, handler, and schemas folders, each with a corresponding mod.rs file to manage visibility:

- config is for modularizing configuration files
- handler is for modularizing GraphQL logic
- schemas is for modularizing the GraphQL schema

Adding a reference to the modules

To use the code in the modules, declare them as modules and import them into the main.rs file:

```rust
// add modules
mod config;
mod handler;
mod schemas;

fn main() {
    println!("Hello, world!");
}
```

Setting up MongoDB

With that done, log in or sign up to your MongoDB account. Click the project dropdown menu and click on the New Project button. Enter projectMngt as the project name, click Next, and click Create Project. Click on Build a Database, select Shared as the type of database, and click on Create to set up a cluster (this might take some time). Next, create a user to access the database externally by entering a Username and Password and clicking on Create User. Also add your IP address to safely connect to the database by clicking on the Add My Current IP Address button. Then click on Finish and Close to save the changes. On saving, you should see a Database Deployments screen.

Connecting our application to MongoDB

With the configuration done, connect the application to the database: click on the Connect button, click on Connect your application, change the Driver to Rust and select the Version, then click the copy icon to copy the connection string.

Setup environment variable

Next, modify the copied connection string with the password of the user created earlier and change the database name. Create a .env file in the root directory and add the snippet below:

```
MONGOURI=mongodb+srv://<YOUR USERNAME HERE>:<YOUR PASSWORD HERE>@cluster0.eakf.mongodb.net/<DATABASE NAME>?retryWrites=true&w=majority
```

A sample of a properly filled connection string:

```
MONGOURI=mongodb+srv://malomz:malomzPassword@cluster0.eahghkf.mongodb.net/projectMngt?retryWrites=true&w=majority
```

Creating GraphQL endpoints

With the setup done, create a schema to represent the application data. Navigate to the schemas folder, create a project_schema.rs file, and add the snippet below:

```rust
use async_graphql::{Enum, InputObject, SimpleObject};
use mongodb::bson::oid::ObjectId;
use serde::{Deserialize, Serialize};

// owner schema
#[derive(Debug, Clone, Serialize, Deserialize, SimpleObject)]
pub struct Owner {
    #[serde(skip_serializing_if = "Option::is_none")]
    pub _id: Option<ObjectId>,
    pub name: String,
    pub email: String,
    pub phone: String,
}

#[derive(InputObject)]
pub struct CreateOwner {
    pub name: String,
    pub email: String,
    pub phone: String,
}

#[derive(InputObject)]
pub struct FetchOwner {
    pub _id: String,
}

// project schema
#[derive(Debug, Clone, Copy, PartialEq, Eq, Serialize, Deserialize, Enum)]
pub enum Status {
    NotStarted,
    InProgress,
    Completed,
}

#[derive(Debug, Clone, Serialize, Deserialize, SimpleObject)]
pub struct Project {
    #[serde(skip_serializing_if = "Option::is_none")]
    pub _id: Option<ObjectId>,
    pub owner_id: String,
    pub name: String,
    pub description: String,
    pub status: Status,
}

#[derive(InputObject)]
pub struct CreateProject {
    pub owner_id: String,
    pub name: String,
    pub description: String,
    pub status: Status,
}

#[derive(InputObject)]
pub struct FetchProject {
    pub _id: String,
}
```

The snippet above does the following:

- Imports the required dependencies.
- Uses the derive macro to generate implementation support for Owner, CreateOwner, FetchOwner, Status, Project, CreateProject, and FetchProject. It also uses the procedural macros from the serde and async-graphql libraries to serialize/deserialize the structs and convert them into a GraphQL schema.

Next, register the project_schema.rs file as part of the schemas module: open the mod.rs in the schemas folder and add:

```rust
pub mod project_schema;
```

Database logic

With the schema fully set up and available to be consumed, we can create the database logic, which will:

- Create a project owner
- Get all owners
- Get a single owner
- Create a project
- Get all projects
- Get a single project

Navigate to the config folder, create a mongo.rs file, and add the snippet below:

```rust
use dotenv::dotenv;
use futures::TryStreamExt;
use std::{env, io::Error};

use mongodb::{
    bson::{doc, oid::ObjectId},
    Client, Collection, Database,
};

use crate::schemas::project_schema::{Owner, Project};

pub struct DBMongo {
    db: Database,
}

impl DBMongo {
    pub async fn init() -> Self {
        dotenv().ok();
        let uri = match env::var("MONGOURI") {
            Ok(v) => v.to_string(),
            Err(_) => format!("Error loading env variable"),
        };
        let client = Client::with_uri_str(uri)
            .await
            .expect("error connecting to database");
        let db = client.database("projectMngt");
        DBMongo { db }
    }

    fn col_helper<T>(data_source: &Self, collection_name: &str) -> Collection<T> {
        data_source.db.collection(collection_name)
    }
}
```

The snippet above does the following:

- Imports the required dependencies.
- Creates a DBMongo struct with a db field to access the MongoDB database.
- Creates an implementation block that adds methods to the DBMongo struct.
- Adds an init method that loads the environment variable, creates a connection to the database, and returns an instance of DBMongo.
- Adds a col_helper method, a generic helper function for creating MongoDB collections.

Next, add the remaining methods to the DBMongo implementation to cater for the project-management operations:

```rust
// imports go here
impl DBMongo {
    // init and col_helper as above

    // Owners logic
    pub async fn create_owner(&self, new_owner: Owner) -> Result<Owner, Error> {
        let new_doc = Owner {
            _id: None,
            name: new_owner.name.clone(),
            email: new_owner.email.clone(),
            phone: new_owner.phone.clone(),
        };
        let col = DBMongo::col_helper::<Owner>(&self, "owner");
        let data = col
            .insert_one(new_doc, None)
            .await
            .expect("Error creating owner");
        let new_owner = Owner {
            _id: data.inserted_id.as_object_id(),
            name: new_owner.name.clone(),
            email: new_owner.email.clone(),
            phone: new_owner.phone.clone(),
        };
        Ok(new_owner)
    }

    pub async fn get_owners(&self) -> Result<Vec<Owner>, Error> {
        let col = DBMongo::col_helper::<Owner>(&self, "owner");
        let mut cursors = col
            .find(None, None)
            .await
            .expect("Error getting list of owners");
        let mut owners: Vec<Owner> = Vec::new();
        while let Some(owner) = cursors
            .try_next()
            .await
            .expect("Error mapping through cursor")
        {
            owners.push(owner)
        }
        Ok(owners)
    }

    pub async fn single_owner(&self, id: &String) -> Result<Owner, Error> {
        let obj_id = ObjectId::parse_str(id).unwrap();
        let filter = doc! {"_id": obj_id};
        let col = DBMongo::col_helper::<Owner>(&self, "owner");
        let owner_detail = col
            .find_one(filter, None)
            .await
            .expect("Error getting owner's detail");
        Ok(owner_detail.unwrap())
    }

    // project logic
    pub async fn create_project(&self, new_project: Project) -> Result<Project, Error> {
        let new_doc = Project {
            _id: None,
            owner_id: new_project.owner_id.clone(),
            name: new_project.name.clone(),
            description: new_project.description.clone(),
            status: new_project.status.clone(),
        };
        let col = DBMongo::col_helper::<Project>(&self, "project");
        let data = col
            .insert_one(new_doc, None)
            .await
            .expect("Error creating project");
        let new_project = Project {
            _id: data.inserted_id.as_object_id(),
            owner_id: new_project.owner_id.clone(),
            name: new_project.name.clone(),
            description: new_project.description.clone(),
            status: new_project.status.clone(),
        };
        Ok(new_project)
    }

    pub async fn get_projects(&self) -> Result<Vec<Project>, Error> {
        let col = DBMongo::col_helper::<Project>(&self, "project");
        let mut cursors = col
            .find(None, None)
            .await
            .expect("Error getting list of projects");
        let mut projects: Vec<Project> = Vec::new();
        while let Some(project) = cursors
            .try_next()
            .await
            .expect("Error mapping through cursor")
        {
            projects.push(project)
        }
        Ok(projects)
    }

    pub async fn single_project(&self, id: &String) -> Result<Project, Error> {
        let obj_id = ObjectId::parse_str(id).unwrap();
        let filter = doc! {"_id": obj_id};
        let col = DBMongo::col_helper::<Project>(&self, "project");
        let project_detail = col
            .find_one(filter, None)
            .await
            .expect("Error getting project's detail");
        Ok(project_detail.unwrap())
    }
}
```

The snippet above does the following:

- create_owner takes self and new_owner as parameters and returns the created owner or an error. Inside the method, we create a new document using the Owner struct, use col_helper to create a collection, call insert_one to create the owner while handling errors, and finally return the created owner's information.
- get_owners takes self as a parameter and returns the list of owners or an error. It uses col_helper to create a collection and calls find without any filter so that it matches all documents in the database, looping through the cursor with try_next while handling errors.
- single_owner takes self and an id as parameters and returns the owner's detail or an error. It converts the id to an ObjectId, uses it as a filter, and calls find_one on the collection while handling errors.
- create_project, get_projects, and single_project mirror the owner methods for the "project" collection.

Finally, register the mongo.rs file as part of the config module: open the mod.rs in the config folder and add:

```rust
pub mod mongo;
```

GraphQL handlers

With the database logic sorted out, we can start using it to create the GraphQL handlers. Navigate to the handler folder, create a graphql_handler.rs file, and add the snippet below:

```rust
use crate::config::mongo::DBMongo;
use crate::schemas::project_schema::{
    CreateOwner, CreateProject, FetchOwner, FetchProject, Owner, Project,
};
use async_graphql::{Context, EmptySubscription, FieldResult, Object, Schema};

pub struct Query;

#[Object(extends)]
impl Query {
    // owners query
    async fn owner(&self, ctx: &Context<'_>, input: FetchOwner) -> FieldResult<Owner> {
        let db = &ctx.data_unchecked::<DBMongo>();
        let owner = db.single_owner(&input._id).await.unwrap();
        Ok(owner)
    }

    async fn get_owners(&self, ctx: &Context<'_>) -> FieldResult<Vec<Owner>> {
        let db = &ctx.data_unchecked::<DBMongo>();
        let owners = db.get_owners().await.unwrap();
        Ok(owners)
    }

    // projects query
    async fn project(&self, ctx: &Context<'_>, input: FetchProject) -> FieldResult<Project> {
        let db = &ctx.data_unchecked::<DBMongo>();
        let project = db.single_project(&input._id).await.unwrap();
        Ok(project)
    }

    async fn get_projects(&self, ctx: &Context<'_>) -> FieldResult<Vec<Project>> {
        let db = &ctx.data_unchecked::<DBMongo>();
        let projects = db.get_projects().await.unwrap();
        Ok(projects)
    }
}

pub struct Mutation;

#[Object]
impl Mutation {
    // owner mutation
    async fn create_owner(&self, ctx: &Context<'_>, input: CreateOwner) -> FieldResult<Owner> {
        let db = &ctx.data_unchecked::<DBMongo>();
        let new_owner = Owner {
            _id: None,
            email: input.email,
            name: input.name,
            phone: input.phone,
        };
        let owner = db.create_owner(new_owner).await.unwrap();
        Ok(owner)
    }

    async fn create_project(
        &self,
        ctx: &Context<'_>,
        input: CreateProject,
    ) -> FieldResult<Project> {
        let db = &ctx.data_unchecked::<DBMongo>();
        let new_project = Project {
            _id: None,
            owner_id: input.owner_id,
            name: input.name,
            description: input.description,
            status: input.status,
        };
        let project = db.create_project(new_project).await.unwrap();
        Ok(project)
    }
}

pub type ProjectSchema = Schema<Query, Mutation, EmptySubscription>;
```

The snippet above does the following:

- Imports the required dependencies.
- Creates a Query struct whose methods query the database via the corresponding database-logic methods.
- Creates a Mutation struct whose methods modify the database via the corresponding database-logic methods.
- Creates a ProjectSchema type that constructs the GraphQL schema from Query, Mutation, and EmptySubscription (since we don't have any subscriptions).

Creating the GraphQL server

Finally, create the GraphQL server by integrating the ProjectSchema and MongoDB with Actix web. Navigate to the main.rs file and modify it as shown below:

```rust
mod config;
mod handler;
mod schemas;

// add
use actix_web::{
    guard,
    web::{self, Data},
    App, HttpResponse, HttpServer,
};
use async_graphql::{
    http::{playground_source, GraphQLPlaygroundConfig},
    EmptySubscription, Schema,
};
use async_graphql_actix_web::{GraphQLRequest, GraphQLResponse};
use config::mongo::DBMongo;
use handler::graphql_handler::{Mutation, ProjectSchema, Query};

// graphql entry
async fn index(schema: Data<ProjectSchema>, req: GraphQLRequest) -> GraphQLResponse {
    schema.execute(req.into_inner()).await.into()
}

async fn graphql_playground() -> HttpResponse {
    HttpResponse::Ok()
        .content_type("text/html; charset=utf-8")
        .body(playground_source(GraphQLPlaygroundConfig::new("/")))
}

#[actix_web::main]
async fn main() -> std::io::Result<()> {
    // connect to the data source
    let db = DBMongo::init().await;
    let schema_data = Schema::build(Query, Mutation, EmptySubscription)
        .data(db)
        .finish();
    HttpServer::new(move || {
        App::new()
            .app_data(Data::new(schema_data.clone()))
            .service(web::resource("/").guard(guard::Post()).to(index))
            .service(web::resource("/").guard(guard::Get()).to(graphql_playground))
    })
    .bind("<host>:<port>")?
    .run()
    .await
}
```

The snippet above does the following:

- Imports the required dependencies.
- Creates an index function that uses the ProjectSchema type to serve GraphQL requests.
- Creates a graphql_playground function that serves GraphiQL, a GraphQL playground we can access from a browser.
- Uses the #[actix_web::main] macro to run the main function asynchronously within the Actix runtime. The main function creates a db variable to establish a connection to MongoDB by calling the init method and uses it to build the GraphQL schema data; it then creates a new server using the HttpServer struct, whose closure serves incoming requests with an App instance that accepts the GraphQL data, a Post service to handle all incoming GraphQL requests, and a Get service to render the GraphiQL playground; finally, it configures the server to run asynchronously and process HTTP requests on localhost.

With that done, we can test the application by running the command below in a terminal:

```shell
cargo run
```

Then navigate to the server address in a web browser.

Conclusion

This post discussed how to modularize a Rust application, build a GraphQL server, and persist data using MongoDB. These resources might be helpful:

- Async GraphQL
- Actix web
- MongoDB Rust Driver
- Async GraphQL Actix integration
- Build a REST API with Rust and MongoDB
- Serde (serializing and deserializing library)

2022-07-25 09:14:54
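Once the Actix server above is running, its single endpoint can be exercised from the command line as well as from the playground. The sketch below assembles a GraphQL mutation payload matching the createOwner resolver described in the article; the endpoint URL, host, and port are assumptions (the article's bind address was not shown), so adjust them to wherever `cargo run` serves the app, and note the curl call is commented out because it needs the live server:

```shell
# Assumed endpoint: the article binds host/port that were not shown.
ENDPOINT="http://localhost:8000/"

# A GraphQL mutation for the createOwner resolver, as a JSON payload.
PAYLOAD='{"query":"mutation { createOwner(input: { name: \"Jane\", email: \"jane@example.com\", phone: \"12345\" }) { name email phone } }"}'

echo "$PAYLOAD"

# Send it while the server is running:
# curl -s -X POST -H "Content-Type: application/json" -d "$PAYLOAD" "$ENDPOINT"
```

The same shape works for the queries (getOwners, project, getProjects); only the operation inside the "query" string changes.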
海外TECH DEV Community Create a GraphQL-powered project management endpoint in Rust and MongoDB - Rocket version https://dev.to/hackmamba/create-a-graphql-powered-project-management-endpoint-in-rust-and-mongodb-rocket-version-3m67

Create a GraphQL-powered project management endpoint in Rust and MongoDB - Rocket version

GraphQL is a query language for reading and manipulating data for APIs. It prioritizes giving clients or servers the exact data requirement by providing a flexible and intuitive syntax to describe such data. Compared to a traditional REST API, GraphQL provides a type system to describe schemas for data and, in turn, gives consumers of the API the affordance to explore and request the needed data using a single endpoint.

This post will discuss building a project management application with Rust using the Async-graphql library and MongoDB. At the end of this tutorial, we will learn how to create a GraphQL endpoint that supports reading and manipulating project management data, and how to persist our data using MongoDB. The GitHub repository can be found here.

Prerequisites

To fully grasp the concepts presented in this tutorial, experience with Rust is required. Experience with MongoDB isn't a requirement, but it's nice to have. We will also be needing the following:

- Basic knowledge of GraphQL
- A MongoDB account to host the database. Signup is completely free.

Let's code!

Getting Started

To get started, we need to navigate to the desired directory and run the command below in our terminal:

```shell
cargo new project-mngt-rust-graphql-rocket && cd project-mngt-rust-graphql-rocket
```

This command creates a Rust project called project-mngt-rust-graphql-rocket and navigates into the project directory.

Next, we proceed to install the required dependencies by modifying the dependencies section of the Cargo.toml file as shown below (the exact version numbers were lost in extraction and are shown as "*" placeholders):

```toml
#other code section goes here

[dependencies]
rocket = { version = "*", features = ["json"] }
async-graphql = { version = "*", features = ["bson", "chrono"] }
async-graphql-rocket = "*"
serde = "*"
dotenv = "*"

[dependencies.mongodb]
version = "*"
default-features = false
features = ["sync"]
```

- rocket is a Rust-based framework for building web applications. It also specifies the required version and the feature type (json).
- async-graphql is a server-side library for building GraphQL in Rust. It also features bson and chrono.
- async-graphql-rocket is a library that helps integrate async-graphql with Rocket.
- serde is a framework for serializing and deserializing Rust data structures, e.g. converting Rust structs to JSON.
- dotenv is a library for managing environment variables.
- mongodb is a driver for connecting to MongoDB. It also specifies the required version and the feature type (sync API).

We need to run the command below to install the dependencies:

```shell
cargo build
```

Module system in Rust

Modules are like folder structures in our application; they simplify how we manage dependencies. To do this, we need to navigate to the src folder and create the config, handler, and schemas folders with their corresponding mod.rs files to manage visibility:

- config is for modularizing configuration files
- handler is for modularizing GraphQL logic
- schemas is for modularizing the GraphQL schema

Adding a reference to the Modules

To use the code in the modules, we need to declare them as modules and import them into the main.rs file:

```rust
//add modules
mod config;
mod handler;
mod schemas;

fn main() {
    println!("Hello, world!");
}
```

Setting up MongoDB

With that done, we need to log in or sign up into our MongoDB account. Click the project dropdown menu and click on the New Project button. Enter projectMngt as the project name, click Next, and click Create Project. Click on Build a Database and select Shared as the type of database. Click on Create to set up a cluster; this might take some time to set up. Next, we need to create a user to access the database externally by inputting the Username and Password and then clicking on Create User. We also need to add our IP address to safely connect to the database by clicking on the Add My Current IP Address button.
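The module layout described above can be sketched with inline modules; in the real project each mod maps to a folder with its own mod.rs, and the item names here are placeholders rather than the article's actual code:

```rust
// Sketch of the config/handler/schemas module layout, written inline.
// In the real project each `mod` is a folder containing a mod.rs file.
mod config {
    pub mod mongo {
        // Placeholder item standing in for the database configuration code.
        pub fn name() -> &'static str {
            "config::mongo"
        }
    }
}

mod schemas {
    pub mod project_schema {
        // Placeholder struct standing in for the real schema types.
        pub struct Owner {
            pub name: String,
        }
    }
}

fn main() {
    // Items are reached through their module paths, mirroring the real layout.
    let owner = schemas::project_schema::Owner { name: "Jane".to_string() };
    println!("{} / {}", config::mongo::name(), owner.name);
}
```

The pub keywords matter: without them, items declared inside a module are private to that module and the main.rs file could not reach them.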
Then click on Finish and Close to save changes. On saving the changes, we should see a Database Deployments screen.

Connecting our application to MongoDB

With the configuration done, we need to connect our application with the database created. To do this, click on the Connect button, click on Connect your application, and change the Driver to Rust with the matching Version. Then click on the copy icon to copy the connection string.

Setup Environment Variable

Next, we must modify the copied connection string with the user's password we created earlier and change the database name. To do this, first we need to create a .env file in the root directory, and in this file add the snippet below:

```shell
MONGOURI=mongodb+srv://<YOUR USERNAME HERE>:<YOUR PASSWORD HERE>@cluster.eakf.mongodb.net/<DATABASE NAME>?retryWrites=true&w=majority
```

A sample of a properly filled connection string:

```shell
MONGOURI=mongodb+srv://malomz:malomzPassword@cluster.eahghkf.mongodb.net/projectMngt?retryWrites=true&w=majority
```

Creating GraphQL Endpoints

With the setup done, we need to create a schema to represent our application data. To do this, we need to navigate to the schemas folder, and in this folder create a project_schema.rs file and add the snippet below:

```rust
use async_graphql::{Enum, InputObject, SimpleObject};
use mongodb::bson::oid::ObjectId;
use serde::{Deserialize, Serialize};

//owner schema
#[derive(Debug, Clone, Serialize, Deserialize, SimpleObject)]
pub struct Owner {
    #[serde(skip_serializing_if = "Option::is_none")]
    pub id: Option<ObjectId>,
    pub name: String,
    pub email: String,
    pub phone: String,
}

#[derive(InputObject)]
pub struct CreateOwner {
    pub name: String,
    pub email: String,
    pub phone: String,
}

#[derive(InputObject)]
pub struct FetchOwner {
    pub id: String,
}

//project schema
#[derive(Debug, Clone, Copy, PartialEq, Eq, Serialize, Deserialize, Enum)]
pub enum Status {
    NotStarted,
    InProgress,
    Completed,
}

#[derive(Debug, Clone, Serialize, Deserialize, SimpleObject)]
pub struct Project {
    #[serde(skip_serializing_if = "Option::is_none")]
    pub id: Option<ObjectId>,
    pub owner_id: String,
    pub name: String,
    pub description: String,
    pub status: Status,
}

#[derive(InputObject)]
pub struct CreateProject {
    pub owner_id: String,
    pub name: String,
    pub description: String,
    pub status: Status,
}

#[derive(InputObject)]
pub struct FetchProject {
    pub id: String,
}
```

The snippet above does the following:

- Imports the required dependencies
- Uses the derive macro to generate implementation support for Owner, CreateOwner, FetchOwner, Status, Project, CreateProject, and FetchProject. The snippet also uses the procedural macros from the serde and async-graphql libraries to serialize and deserialize data and to convert Rust structs to a GraphQL schema.

Next, we must register the project_schema.rs file as part of the schemas module. To do this, open the mod.rs in the schemas folder and add the snippet below:

```rust
pub mod project_schema;
```

Database Logic

With the schema fully set up and made available to be consumed, we can now create our database logic that will do the following:

- Create a project owner
- Get all owners
- Get a single owner
- Create a project
- Get all projects
- Get a single project

To do this, first we need to navigate to the config folder, and in this folder create a mongo.rs file and add the snippet below:

```rust
use dotenv::dotenv;
use std::{env, io::Error};

use mongodb::{
    bson::{doc, oid::ObjectId},
    sync::{Client, Collection, Database},
};

use crate::schemas::project_schema::{Owner, Project};

pub struct DBMongo {
    db: Database,
}

impl DBMongo {
    pub fn init() -> Self {
        dotenv().ok();
        let uri = match env::var("MONGOURI") {
            Ok(v) => v.to_string(),
            Err(_) => format!("Error loading env variable"),
        };
        let client = Client::with_uri_str(uri).unwrap();
        let db = client.database("projectMngt");
        DBMongo { db }
    }

    fn col_helper<T>(data_source: &Self, collection_name: &str) -> Collection<T> {
        data_source.db.collection(collection_name)
    }
}
```

The snippet above does the following:

- Imports the required dependencies
- Creates a DBMongo struct with a db field to access the MongoDB database
- Creates an implementation block that adds methods to the DBMongo struct
- Adds an init method to the implementation block to load the environment variable and create a connection to the database, returning an instance of the DBMongo struct
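The MONGOURI lookup inside init boils down to a match on std::env::var. A self-contained sketch of just that step (the function name is hypothetical, and the fallback string mirrors the article's error message):

```rust
use std::env;

// Sketch of the env-var lookup performed in DBMongo::init.
// `read_uri` is a hypothetical helper name; the fallback message mirrors
// the article's "Error loading env variable" string.
fn read_uri(key: &str) -> String {
    match env::var(key) {
        Ok(v) => v,
        Err(_) => format!("Error loading {key} variable"),
    }
}

fn main() {
    // Without a loaded .env file, an unset key falls back to the error text.
    println!("{}", read_uri("MONGOURI"));
}
```

In the real code, dotenv().ok() runs first so that values from the .env file are visible to env::var.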
The implementation block also adds a col_helper method, a helper function for creating a MongoDB collection.

Next, we need to add the remaining methods to the DBMongo implementation to cater to the project management operations:

```rust
//imports go here

pub struct DBMongo {
    db: Database,
}

impl DBMongo {
    pub fn init() -> Self {
        //init code goes here
    }

    fn col_helper<T>(data_source: &Self, collection_name: &str) -> Collection<T> {
        data_source.db.collection(collection_name)
    }

    //Owners logic
    pub fn create_owner(&self, new_owner: Owner) -> Result<Owner, Error> {
        let new_doc = Owner {
            id: None,
            name: new_owner.name.clone(),
            email: new_owner.email.clone(),
            phone: new_owner.phone.clone(),
        };
        let col = DBMongo::col_helper::<Owner>(&self, "owner");
        let data = col
            .insert_one(new_doc, None)
            .ok()
            .expect("Error creating owner");
        let new_owner = Owner {
            id: data.inserted_id.as_object_id(),
            name: new_owner.name.clone(),
            email: new_owner.email.clone(),
            phone: new_owner.phone.clone(),
        };
        Ok(new_owner)
    }

    pub fn get_owners(&self) -> Result<Vec<Owner>, Error> {
        let col = DBMongo::col_helper::<Owner>(&self, "owner");
        let cursors = col
            .find(None, None)
            .ok()
            .expect("Error getting list of owners");
        let owners: Vec<Owner> = cursors.map(|doc| doc.unwrap()).collect();
        Ok(owners)
    }

    pub fn single_owner(&self, id: &String) -> Result<Owner, Error> {
        let obj_id = ObjectId::parse_str(id).unwrap();
        let filter = doc! {"_id": obj_id};
        let col = DBMongo::col_helper::<Owner>(&self, "owner");
        let owner_detail = col
            .find_one(filter, None)
            .ok()
            .expect("Error getting owner's detail");
        Ok(owner_detail.unwrap())
    }

    //project logic
    pub fn create_project(&self, new_project: Project) -> Result<Project, Error> {
        let new_doc = Project {
            id: None,
            owner_id: new_project.owner_id.clone(),
            name: new_project.name.clone(),
            description: new_project.description.clone(),
            status: new_project.status.clone(),
        };
        let col = DBMongo::col_helper::<Project>(&self, "project");
        let data = col
            .insert_one(new_doc, None)
            .ok()
            .expect("Error creating project");
        let new_project = Project {
            id: data.inserted_id.as_object_id(),
            owner_id: new_project.owner_id.clone(),
            name: new_project.name.clone(),
            description: new_project.description.clone(),
            status: new_project.status.clone(),
        };
        Ok(new_project)
    }

    pub fn get_projects(&self) -> Result<Vec<Project>, Error> {
        let col = DBMongo::col_helper::<Project>(&self, "project");
        let cursors = col
            .find(None, None)
            .ok()
            .expect("Error getting list of projects");
        let projects: Vec<Project> = cursors.map(|doc| doc.unwrap()).collect();
        Ok(projects)
    }

    pub fn single_project(&self, id: &String) -> Result<Project, Error> {
        let obj_id = ObjectId::parse_str(id).unwrap();
        let filter = doc! {"_id": obj_id};
        let col = DBMongo::col_helper::<Project>(&self, "project");
        let project_detail = col
            .find_one(filter, None)
            .ok()
            .expect("Error getting project's detail");
        Ok(project_detail.unwrap())
    }
}
```

The snippet above does the following:

- Adds a create_owner method that takes self and new_owner as parameters and returns the created owner or an error. Inside the method, we create a new document using the Owner struct, use the col_helper method to create a new collection, and access the insert_one function to create a new owner and handle errors. Finally, we return the created owner's information.
- Adds a get_owners method that takes self as a parameter and returns the list of owners or an error. Inside the method, we use the col_helper method to create a new collection and access the find function without any filter so that it matches all the documents inside the database; we return the list by using the map method to loop through the list of owners and handle errors.
- Adds a single_owner method that takes self and id as parameters and returns the owner's detail or an error. Inside the method, we convert the id to an ObjectId and use it as a filter to get the matching document. Then we use the col_helper method to create a new collection and access the find_one function to get the details of the owner and handle errors.
- Adds a create_project method that takes self and new_project as parameters and returns the created project or an error. Inside the method, we create a new document using the Project struct, use the col_helper method to create a new collection, and access the insert_one function to create a new project and handle errors. Finally, we return the created project's information.
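Note that single_owner and single_project call ObjectId::parse_str, which only succeeds for a 24-character hexadecimal id; any other string would make the unwrap panic. A small sketch of that validity check (the function is illustrative, not part of the mongodb crate):

```rust
// ObjectId::parse_str accepts only a 24-character hex string. This
// illustrative helper mimics that check so the id handling in
// single_owner/single_project is easier to follow.
fn is_valid_object_id(id: &str) -> bool {
    id.len() == 24 && id.chars().all(|c| c.is_ascii_hexdigit())
}

fn main() {
    println!("{}", is_valid_object_id("62de2b6eaba9d9acbdf1d6d4"));
    println!("{}", is_valid_object_id("not-an-id"));
}
```

Validating ids before calling parse_str, or matching on its Result, would turn the panic into a proper GraphQL error response.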
- Adds a get_projects method that takes self as a parameter and returns the list of projects or an error. Inside the method, we use the col_helper method to create a new collection and access the find function without any filter so that it matches all the documents inside the database; we return the list by using the map method to loop through the list of projects and handle errors.
- Adds a single_project method that takes self and id as parameters and returns the project's detail or an error. Inside the method, we convert the id to an ObjectId and use it as a filter to get the matching document. Then we use the col_helper method to create a new collection and access the find_one function to get the details of the project and handle errors.

Finally, we must register the mongo.rs file as part of the config module. To do this, open the mod.rs in the config folder and add the snippet below:

```rust
pub mod mongo;
```

GraphQL Handlers

With the database logic sorted out, we can start using it to create our GraphQL handlers. To do this, first we need to navigate to the handler folder, and in this folder create a graphql_handler.rs file and add the snippet below:

```rust
use crate::{
    config::mongo::DBMongo,
    schemas::project_schema::{
        CreateOwner, CreateProject, FetchOwner, FetchProject, Owner, Project,
    },
};
use async_graphql::{Context, EmptySubscription, FieldResult, Object, Schema};

pub struct Query;

#[Object(extends)]
impl Query {
    //owners query
    async fn owner(&self, ctx: &Context<'_>, input: FetchOwner) -> FieldResult<Owner> {
        let db = &ctx.data_unchecked::<DBMongo>();
        let owner = db.single_owner(&input.id).unwrap();
        Ok(owner)
    }

    async fn get_owners(&self, ctx: &Context<'_>) -> FieldResult<Vec<Owner>> {
        let db = &ctx.data_unchecked::<DBMongo>();
        let owners = db.get_owners().unwrap();
        Ok(owners)
    }

    //projects query
    async fn project(&self, ctx: &Context<'_>, input: FetchProject) -> FieldResult<Project> {
        let db = &ctx.data_unchecked::<DBMongo>();
        let project = db.single_project(&input.id).unwrap();
        Ok(project)
    }

    async fn get_projects(&self, ctx: &Context<'_>) -> FieldResult<Vec<Project>> {
        let db = &ctx.data_unchecked::<DBMongo>();
        let projects = db.get_projects().unwrap();
        Ok(projects)
    }
}

pub struct Mutation;

#[Object]
impl Mutation {
    //owner mutation
    async fn create_owner(&self, ctx: &Context<'_>, input: CreateOwner) -> FieldResult<Owner> {
        let db = &ctx.data_unchecked::<DBMongo>();
        let new_owner = Owner {
            id: None,
            email: input.email,
            name: input.name,
            phone: input.phone,
        };
        let owner = db.create_owner(new_owner).unwrap();
        Ok(owner)
    }

    async fn create_project(
        &self,
        ctx: &Context<'_>,
        input: CreateProject,
    ) -> FieldResult<Project> {
        let db = &ctx.data_unchecked::<DBMongo>();
        let new_project = Project {
            id: None,
            owner_id: input.owner_id,
            name: input.name,
            description: input.description,
            status: input.status,
        };
        let project = db.create_project(new_project).unwrap();
        Ok(project)
    }
}

pub type ProjectSchema = Schema<Query, Mutation, EmptySubscription>;
```

The snippet above does the following:

- Imports the required dependencies
- Creates a Query struct with implementation methods related to querying the database, using the corresponding methods from the database logic
- Creates a Mutation struct with implementation methods related to modifying the database, using the corresponding methods from the database logic
- Creates a ProjectSchema type to construct our GraphQL schema using the Query struct, the Mutation struct, and EmptySubscription, since we don't have any subscriptions

Creating the GraphQL Server

Finally, we can start creating our GraphQL server by integrating the ProjectSchema and MongoDB with Rocket. To do this, we need to navigate to the main.rs file and modify it as shown below:
```rust
mod config;
mod handler;
mod schemas;

use async_graphql::http::{playground_source, GraphQLPlaygroundConfig};
use async_graphql::{EmptySubscription, Schema};
use async_graphql_rocket::{GraphQLQuery, GraphQLRequest, GraphQLResponse};
use config::mongo::DBMongo;
use handler::graphql_handler::{Mutation, ProjectSchema, Query};
use rocket::{response::content, routes, State};

#[rocket::get("/graphql?<query..>")]
async fn graphql_query(schema: &State<ProjectSchema>, query: GraphQLQuery) -> GraphQLResponse {
    query.execute(schema).await
}

#[rocket::post("/graphql", data = "<request>", format = "application/json")]
async fn graphql_mutation(
    schema: &State<ProjectSchema>,
    request: GraphQLRequest,
) -> GraphQLResponse {
    request.execute(schema).await
}

#[rocket::get("/")]
async fn graphql_playground() -> content::RawHtml<String> {
    content::RawHtml(playground_source(GraphQLPlaygroundConfig::new("/graphql")))
}

#[rocket::launch]
fn rocket() -> _ {
    let db = DBMongo::init();
    let schema = Schema::build(Query, Mutation, EmptySubscription)
        .data(db)
        .finish();
    rocket::build()
        .manage(schema)
        .mount("/", routes![graphql_query, graphql_mutation, graphql_playground])
}
```

The snippet above does the following:

- Imports the required dependencies
- Creates a graphql_query function with the get procedural macro to specify the GraphQL route, and uses the ProjectSchema type to execute methods related to querying the database
- Creates a graphql_mutation function with the post procedural macro to specify the GraphQL route, and uses the ProjectSchema type to execute methods related to modifying the database
- Creates a graphql_playground function to create GraphiQL, a GraphQL playground we can access from a browser
- Uses the rocket::launch macro to run the rocket function, which generates an application entry point and runs the server

The rocket function also does the following:

- Creates a db variable to establish a connection to MongoDB by calling the init method, and uses it to build the GraphQL data
- Builds the application using the build function, adds the schema to state, and configures the routes to include graphql_query, graphql_mutation, and graphql_playground

With that done, we can test our application by running the command below in our terminal:

```shell
cargo run
```

Then navigate to the server address in a web browser.

Conclusion

This post discussed how to modularize a Rust application, build a GraphQL server, and persist our data using MongoDB.

These resources might be helpful:

- Async GraphQL
- Rocket
- MongoDB Rust Driver
- Build a REST API with Rust and MongoDB
- Async GraphQL Rocket integration
- Serde: Serializing and Deserializing library

2022-07-25 09:14:46
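Either server version accepts the same mutation documents. A rough sketch of assembling a createOwner mutation string (argument values are examples; the exact field syntax should be checked in the playground, which exposes the generated schema):

```rust
// Sketch: build a createOwner mutation document for the endpoint above.
// The argument values are examples; the input-object syntax follows the
// CreateOwner shape described in the article but is not taken verbatim from it.
fn create_owner_mutation(name: &str, email: &str, phone: &str) -> String {
    format!(
        "mutation {{ createOwner(input: {{ name: \"{name}\", email: \"{email}\", phone: \"{phone}\" }}) {{ id name }} }}"
    )
}

fn main() {
    println!("{}", create_owner_mutation("Jane", "jane@example.com", "12345"));
}
```

The resulting string goes into the query field of the JSON body POSTed to the /graphql route.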
海外TECH DEV Community Create a GraphQL-powered project management endpoint in Golang and MongoDB https://dev.to/hackmamba/create-a-graphql-powered-project-management-endpoint-in-golang-and-mongodb-18a

Create a GraphQL-powered project management endpoint in Golang and MongoDB

GraphQL is a query language for reading and manipulating data for APIs. It prioritizes giving clients or servers the exact data requirement by providing a flexible and intuitive syntax to describe such data. Compared to a traditional REST API, GraphQL provides a type system to describe schemas for data and, in turn, gives consumers of the API the affordance to explore and request the needed data using a single endpoint.

This post will discuss building a project management application with Golang using the gqlgen library and MongoDB. At the end of this tutorial, we will learn how to create a GraphQL endpoint that supports reading and manipulating project management data, and how to persist our data using MongoDB. The GitHub repository can be found here.

Prerequisites

To fully grasp the concepts presented in this tutorial, experience with Golang is required. Experience with MongoDB isn't a requirement, but it's nice to have. We will also be needing the following:

- Basic knowledge of GraphQL
- A MongoDB account to host the database. Signup is completely free.

Let's code!

Getting Started

To get started, we need to navigate to the desired directory and run the command below in our terminal:

```shell
mkdir project-mngt-golang-graphql && cd project-mngt-golang-graphql
```

This command creates a project-mngt-golang-graphql folder and navigates into the project directory. Next, we need to initialize a Go module to manage project dependencies by running the command below:

```shell
go mod init project-mngt-golang-graphql
```

This command will create a go.mod file for tracking project dependencies. We proceed to install the required dependencies with:

```shell
go get github.com/99designs/gqlgen go.mongodb.org/mongo-driver/mongo github.com/joho/godotenv
```

- github.com/99designs/gqlgen is a library for creating GraphQL applications in Go.
- go.mongodb.org/mongo-driver/mongo is a driver for connecting to MongoDB.
- github.com/joho/godotenv is a library for managing environment variables.

Project Initialization

The gqlgen library uses a schema-first approach: it lets us define our APIs using GraphQL's Schema Definition Language. The library also lets us focus on implementation by generating a project boilerplate. To generate the project boilerplate, we need to run the command below:

```shell
go run github.com/99designs/gqlgen init
```

The command above generates the following files:

- gqlgen.yml: a file for configuring gqlgen
- graph/generated/generated.go: a file containing all the code gqlgen autogenerates during execution. We don't need to edit this file.
- graph/model/models_gen.go: a file containing generated models required to build the GraphQL server. This file is also autogenerated by gqlgen.
- graph/schema.graphqls: a file for defining our schemas
- graph/schema.resolvers.go: a file to define our application logic
- server.go: this file is our application entry point

PS: We might get an error about missing dependencies. We can fix this by reinstalling the packages we installed earlier:

```shell
go get github.com/99designs/gqlgen go.mongodb.org/mongo-driver/mongo github.com/joho/godotenv
```

Setting up MongoDB

With that done, we need to log in or sign up into our MongoDB account. Click the project dropdown menu and click on the New Project button. Enter projectMngt as the project name, click Next, and click Create Project. Click on Build a Database and select Shared as the type of database. Click on Create to set up a cluster; this might take some time to set up. Next, we need to create a user to access the database externally by inputting the Username and Password and then clicking on Create User. We also need to add our IP address to safely connect to the database by clicking on the Add My Current IP Address button. Then click on Finish and Close to save changes. On saving the changes, we should see a Database Deployments screen as shown
below.

Connecting our application to MongoDB

With the configuration done, we need to connect our application with the database created. To do this, click on the Connect button, click on Connect your application, and change the Driver to Go with the matching Version. Then click on the copy icon to copy the connection string.

Setup Environment Variable

Next, we must modify the copied connection string with the user's password we created earlier and change the database name. To do this, first we need to create a .env file in the root directory, and in this file add the snippet below:

```shell
MONGOURI=mongodb+srv://<YOUR USERNAME HERE>:<YOUR PASSWORD HERE>@cluster.eakf.mongodb.net/<DATABASE NAME>?retryWrites=true&w=majority
```

A sample of a properly filled connection string:

```shell
MONGOURI=mongodb+srv://malomz:malomzPassword@cluster.eahghkf.mongodb.net/projectMngt?retryWrites=true&w=majority
```

Load Environment Variable

With that done, we need to create a helper function to load the environment variable using the github.com/joho/godotenv library we installed earlier. To do this, we need to create a configs folder in the root directory, and here create an env.go file and add the snippet below:

```go
package configs

import (
	"log"
	"os"

	"github.com/joho/godotenv"
)

func EnvMongoURI() string {
	err := godotenv.Load()
	if err != nil {
		log.Fatal("Error loading .env file")
	}
	return os.Getenv("MONGOURI")
}
```

The snippet above does the following:

- Imports the required dependencies
- Creates an EnvMongoURI function that checks if the environment variable is correctly loaded and returns the environment variable

Defining our Schema

To do this, we need to navigate to the graph folder, and in this folder update the schema.graphqls file as shown below (nullability markers were lost in extraction and are omitted here):

```graphql
type Owner {
  id: String
  name: String
  email: String
  phone: String
}

type Project {
  id: String
  ownerId: ID
  name: String
  description: String
  status: Status
}

enum Status {
  NOT_STARTED
  IN_PROGRESS
  COMPLETED
}

input FetchOwner {
  id: String
}

input FetchProject {
  id: String
}

input NewOwner {
  name: String
  email: String
  phone: String
}

input NewProject {
  ownerId: ID
  name: String
  description: String
  status: Status
}

type Query {
  owners: [Owner]
  projects: [Project]
  owner(input: FetchOwner): Owner
  project(input: FetchProject): Project
}

type Mutation {
  createProject(input: NewProject): Project
  createOwner(input: NewOwner): Owner
}
```

The snippet above defines the schema we need for our API by creating two types, a Project and an Owner. We also define Query to perform operations on the types, inputs to define creation properties, and Mutation for creating a Project and an Owner.

Creating the application logic

Next, we need to generate logic for our newly created schema using the gqlgen library. To do this, we need to run the command below in our terminal:

```shell
go run github.com/99designs/gqlgen generate
```

On running the command above, we will get errors about Todo models missing in the schema.resolvers.go file; this is because we changed the default model. We can fix the error by deleting the CreateTodo and Todo functions. Our code should look like the snippet below after the deletion:

```go
package graph

// This file will be automatically regenerated based on the schema, any resolver implementations
// will be copied through when generating and any unknown code will be moved to the end.

import (
	"context"
	"fmt"

	"project-mngt-golang-graphql/graph/generated"
	"project-mngt-golang-graphql/graph/model"
)

// CreateProject is the resolver for the createProject field.
func (r *mutationResolver) CreateProject(ctx context.Context, input model.NewProject) (*model.Project, error) {
	panic(fmt.Errorf("not implemented"))
}

// CreateOwner is the resolver for the createOwner field.
func (r *mutationResolver) CreateOwner(ctx context.Context, input model.NewOwner) (*model.Owner, error) {
	panic(fmt.Errorf("not implemented"))
}

// Owners is the resolver for the owners field.
func (r *queryResolver) Owners(ctx context.Context) ([]*model.Owner, error) {
	panic(fmt.Errorf("not implemented"))
}

// Projects is the resolver for the projects field.
func (r *queryResolver) Projects(ctx context.Context) ([]*model.Project, error) {
	panic(fmt.Errorf("not implemented"))
}

// Owner is the resolver for the owner field.
func (r *queryResolver) Owner(ctx context.Context, input model.FetchOwner) (*model.Owner, error) {
	panic(fmt.Errorf("not implemented"))
}

// Project is the resolver for the project field.
func (r *queryResolver) Project(ctx context.Context, input model.FetchProject) (*model.Project, error) {
	panic(fmt.Errorf("not implemented"))
}

// Mutation returns generated.MutationResolver implementation.
func (r *Resolver) Mutation() generated.MutationResolver { return &mutationResolver{r} }

// Query returns generated.QueryResolver implementation.
func (r *Resolver) Query() generated.QueryResolver { return &queryResolver{r} }

type mutationResolver struct{ *Resolver }
type queryResolver struct{ *Resolver }
```

Creating Database Logic

With the GraphQL logic generated, we need to create the code's corresponding database logic. To do this, we need to navigate to the configs folder, here create a db.go file, and add the snippet below:

```go
package configs

import (
	"context"
	"fmt"
	"log"
	"time"

	"project-mngt-golang-graphql/graph/model"

	"go.mongodb.org/mongo-driver/bson"
	"go.mongodb.org/mongo-driver/bson/primitive"
	"go.mongodb.org/mongo-driver/mongo"
	"go.mongodb.org/mongo-driver/mongo/options"
)

type DB struct {
	client *mongo.Client
}

func ConnectDB() *DB {
	client, err := mongo.NewClient(options.Client().ApplyURI(EnvMongoURI()))
	if err != nil {
		log.Fatal(err)
	}

	ctx, _ := context.WithTimeout(context.Background(), time.Second)
	err = client.Connect(ctx)
	if err != nil {
		log.Fatal(err)
	}

	//ping the database
	err = client.Ping(ctx, nil)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("Connected to MongoDB")
	return &DB{client: client}
}

func colHelper(db *DB, collectionName string) *mongo.Collection {
	return db.client.Database("projectMngt").Collection(collectionName)
}

func (db *DB) CreateProject(input *model.NewProject) (*model.Project, error) {
	collection := colHelper(db, "project")
	ctx, cancel := context.WithTimeout(context.Background(), time.Second)
	defer cancel()

	res, err := collection.InsertOne(ctx, input)
	if err != nil {
		return nil, err
	}

	project := &model.Project{
		ID:          res.InsertedID.(primitive.ObjectID).Hex(),
		OwnerID:     input.OwnerID,
		Name:        input.Name,
		Description: input.Description,
		Status:      model.StatusNotStarted,
	}
	return project, err
}

func (db *DB) CreateOwner(input *model.NewOwner) (*model.Owner, error) {
	collection := colHelper(db, "owner")
	ctx, cancel := context.WithTimeout(context.Background(), time.Second)
	defer cancel()

	res, err := collection.InsertOne(ctx, input)
	if err != nil {
		return nil, err
	}

	owner := &model.Owner{
		ID:    res.InsertedID.(primitive.ObjectID).Hex(),
		Name:  input.Name,
		Email: input.Email,
		Phone: input.Phone,
	}
	return owner, err
}

func (db *DB) GetOwners() ([]*model.Owner, error) {
	collection := colHelper(db, "owner")
	ctx, cancel := context.WithTimeout(context.Background(), time.Second)
	var owners []*model.Owner
	defer cancel()

	res, err := collection.Find(ctx, bson.M{})
	if err != nil {
		return nil, err
	}

	defer res.Close(ctx)
	for res.Next(ctx) {
		var singleOwner *model.Owner
		if err = res.Decode(&singleOwner); err != nil {
			log.Fatal(err)
		}
		owners = append(owners, singleOwner)
	}
	return owners, err
}

func (db *DB) GetProjects() ([]*model.Project, error) {
	collection := colHelper(db, "project")
	ctx, cancel := context.WithTimeout(context.Background(), time.Second)
	var projects []*model.Project
	defer cancel()

	res, err := collection.Find(ctx, bson.M{})
	if err != nil {
		return nil, err
	}

	defer res.Close(ctx)
	for res.Next(ctx) {
		var singleProject *model.Project
		if err = res.Decode(&singleProject); err != nil {
			log.Fatal(err)
		}
		projects = append(projects, singleProject)
	}
	return projects, err
}

func (db *DB) SingleOwner(ID string) (*model.Owner, error) {
	collection := colHelper(db, "owner")
	ctx, cancel := context.WithTimeout(context.Background(), time.Second)
	var owner *model.Owner
	defer cancel()

	objId, _ := primitive.ObjectIDFromHex(ID)
	err := collection.FindOne(ctx, bson.M{"_id": objId}).Decode(&owner)
	return owner, err
}

func (db *DB) SingleProject(ID string) (*model.Project, error) {
	collection := colHelper(db, "project")
	ctx, cancel := context.WithTimeout(context.Background(), time.Second)
	var project *model.Project
	defer cancel()

	objId, _ := primitive.ObjectIDFromHex(ID)
	err := collection.FindOne(ctx, bson.M{"_id": objId}).Decode(&project)
	return project, err
}
```

The snippet above does the following:

- Imports the required dependencies
- Creates a DB struct with a client field to access MongoDB
- Creates a ConnectDB function that first configures the client to use the correct URI and checks for
errors. Secondly, it defines a timeout to use when trying to connect; thirdly, it checks for errors while connecting to the database and cancels the connection attempt if it exceeds the timeout. Finally, it pings the database to test the connection and returns a pointer to the DB struct.
- Creates a colHelper function to create a collection
- Creates a CreateProject function that takes the DB struct as a pointer receiver and returns either the created Project or an error. Inside the function, we create a project collection, define a timeout for inserting data into the collection, and use the InsertOne function to insert the input.
- Creates a CreateOwner function that takes the DB struct as a pointer receiver and returns either the created Owner or an error. Inside the function, we create an owner collection, define a timeout for inserting data into the collection, and use the InsertOne function to insert the input.
- Creates a GetOwners function that takes the DB struct as a pointer receiver and returns either the list of Owners or an error. The function follows the previous steps, getting the list of owners using the Find function; we read the returned list by using the cursor's Next method to loop through the owners.
- Creates a GetProjects function that takes the DB struct as a pointer receiver and returns either the list of Projects or an error. The function follows the previous steps, getting the list of projects using the Find function; we read the returned list by using the cursor's Next method to loop through the projects.
- Creates a SingleOwner function that takes the DB struct as a pointer receiver and returns either the matched Owner, using the FindOne function, or an error
- Creates a SingleProject function that takes the DB struct as a pointer receiver and returns either the matched Project, using the FindOne function, or an error

Updating the Application Logic

Next, we need to update the application logic with the database functions. To do this, we need to update the schema.resolvers.go file as shown below:

```go
package graph

// This file will be automatically regenerated based on the schema, any resolver implementations
// will be copied through when generating and any unknown code will be moved to the end.

import (
	"context"

	"project-mngt-golang-graphql/configs" //add this
	"project-mngt-golang-graphql/graph/generated"
	"project-mngt-golang-graphql/graph/model"
)

//add this
var db *configs.DB = configs.ConnectDB()

// CreateProject is the resolver for the createProject field.
func (r *mutationResolver) CreateProject(ctx context.Context, input model.NewProject) (*model.Project, error) {
	//modify here
	project, err := db.CreateProject(&input)
	return project, err
}

// CreateOwner is the resolver for the createOwner field.
func (r *mutationResolver) CreateOwner(ctx context.Context, input model.NewOwner) (*model.Owner, error) {
	//modify here
	owner, err := db.CreateOwner(&input)
	return owner, err
}

// Owners is the resolver for the owners field.
func (r *queryResolver) Owners(ctx context.Context) ([]*model.Owner, error) {
	//modify here
	owners, err := db.GetOwners()
	return owners, err
}

// Projects is the resolver for the projects field.
func (r *queryResolver) Projects(ctx context.Context) ([]*model.Project, error) {
	//modify here
	projects, err := db.GetProjects()
	return projects, err
}

// Owner is the resolver for the owner field.
func (r *queryResolver) Owner(ctx context.Context, input model.FetchOwner) (*model.Owner, error) {
	//modify here
	owner, err := db.SingleOwner(input.ID)
	return owner, err
}

// Project is the resolver for the project field.
func (r *queryResolver) Project(ctx context.Context, input model.FetchProject) (*model.Project, error) {
	//modify here
	project, err := db.SingleProject(input.ID)
	return project, err
}

// Mutation returns generated.MutationResolver implementation.
func (r *Resolver) Mutation() generated.MutationResolver { return &mutationResolver{r} }

// Query returns generated.QueryResolver implementation.
func (r *Resolver) Query() generated.QueryResolver { return &queryResolver{r} }

type mutationResolver struct{ *Resolver }
type queryResolver struct{ *Resolver }
```
queryResolver struct Resolver The snippet above does the following Imports the required dependencyCreates a db variable to initialize the MongoDB using ConnectDB function Modifies the CreateProject CreateOwner Owners Projects Owner and Project function using their corresponding function from the database logic Finally we need to modify the generated model IDs in the models gen go file with a bson id struct tags We use the struct tags to reformat the JSON id returned by MongoDB The remaining part of the code goes here type FetchOwner struct ID string json id bson id modify here type FetchProject struct ID string json id bson id modify here type NewOwner struct code goes here type NewProject struct code goes here type Owner struct ID string json id bson id modify here Name string json name Email string json email Phone string json phone type Project struct ID string json id bson id modify here OwnerID string json ownerId Name string json name Description string json description Status Status json status The remaining part of the code goes hereWith that done we can start a development server using the command below go run server goThen navigate to on a web browser We can also validate the operation on MongoDB ConclusionThis post discussed how to build a project management application with Golang using the gqlgen library and MongoDB These resources might be helpful GraphQL official pagegqlgen GraphQL libraryMongoDB Go DriverBuild a REST API with Golang and MongoDB 2022-07-25 09:14:15
Overseas Tech DEV Community Bitcoin Halving and The Future of Bitcoin. https://dev.to/chizobaonorh/bitcoin-halving-and-the-future-of-bitcoin-4d85

What is Bitcoin Halving?

You probably saw the title, and the first thing that came to mind was: "Bitcoin halving? Yeah, that's simple, cutting bitcoin in half." But still you kept on reading. Why? Curiosity. Why would bitcoin be halved? What does this even mean? Well, today I'm going to be your tour guide, taking you on a trip to the past, to where it all began, so grab some popcorn and let's get started.

How Bitcoin Halving came about

On the 3rd of January 2009, Satoshi Nakamoto created the first-ever digital currency, called Bitcoin. On creation, the supply was capped at 21 million bitcoins. The first mining took place on that same day, putting the first bitcoins into circulation. At the time, bitcoin had little value, and in order to grow in prominence and quantity, the Proof of Work consensus mechanism was adopted. This system involved miners solving a hash puzzle (complex math problems generated by the blockchain) and creating new blocks whenever a hash was solved. The estimated time for new block creation was 10 minutes, after which a block reward of 50 BTC was given to the miner who solved the hash. This was the perfect incentive to attract more miners and increase the circulation of bitcoins. But unlike fiat currencies (like USD and EUR), bitcoin has a limited supply. With a system that gives 50 BTC for every block created, a drop in value would be inevitable, because too much supply of a particular asset reduces demand and makes the asset worthless. This brought about the need for bitcoin halving: a method configured into the blockchain's source code to reduce the reward miners earn by half after every 210,000 blocks are successfully mined, which works out to roughly a four-year period.

Occurrences of Bitcoin Halving

The first halving: This took place in November 2012, after the 210,000th block had been mined; the block reward was reduced from 50 BTC to 25 BTC. At the time, the value of a bitcoin was around $12. After the halving, bitcoin's value rose to an all-time high of around $1,000 in the period before the second halving. This halving was seen to propel the price of bitcoin, using scarcity and low supply to create chaos for higher demand.

The second halving: On the 9th of July 2016, the second halving took place after the 420,000th block had been mined. Miners' block reward was reduced from 25 BTC to 12.5 BTC. At the time of the halving, the value of a bitcoin was around $650, but in December 2017 the price of bitcoin peaked at around $19,700, its all-time high at the time. Imagine owning bitcoins then; that would have been a whopping sum. This period saw investors looking into this new currency that was slowly catching the attention of the masses and the world. But due to its volatility, it didn't maintain this spot for long, dipping sharply later that December.

The third halving: The most recent halving took place on the 11th of May 2020, when the miners' reward was slashed from 12.5 BTC to 6.25 BTC. By this time, bitcoin had started getting the attention it deserved from the media, investors, and even organisations, so a pump in the price was inevitable. The coin made a triumphant entry into its all-time high on the 10th of November 2021, at around $69,000. This showed the effect low supply had on demand during the halvings.

The next halving is estimated to occur in 2024, when the miners' block reward will be reduced from 6.25 BTC to 3.125 BTC. Statistically, by the year 2140 the whole volume of 21 million bitcoins will have been mined into circulation, at which point there will be no block reward for miners. Presently, only about 2 million bitcoins are left to be mined.

But why would miners keep working if there are no block rewards? It is speculated that bitcoin will have gained a greater standing in the market and will be a much more precious asset, likened to gold, after all blocks have been mined. It is also well known that miners get their incentives from block rewards, but it is rarely mentioned that they also generate income from transaction fees on the blockchain. Once mining is over, miners will generate their income solely from transaction fees. These transaction fees on the bitcoin network are generated from the day-to-day activity of traders using the blockchain to make transactions, and as bitcoin increases in value, so do the transaction fees. The CoinDesk article on bitcoin transaction fees gives the average fee per transaction; with the number of transactions that happen in a day, do the math. What happens if many miners leave for a more lucrative blockchain? The fewer the miners, the better the chances of a remaining miner earning consistently from signing and validating transactions. But this could take a bad turn, because the security of the blockchain would be in jeopardy: it could be hacked, as fewer people are left to validate and protect it.

The Future of Bitcoin after the Halving

This is a question that cannot be answered with certainty, but two possible outcomes could play out.

Positive outcome: If everything goes according to plan and speculation, bitcoin will have reached a higher level of prominence, where holding one bitcoin feels as though you held a dragon egg. By then, companies, stores, people, and the world at large will have started using it as a normalized form of exchange.

Negative outcome: We could all just move on from it to other altcoins with better value. For some time now, investors have been seeking out other digital currencies that do not consume as much electricity and computing power as bitcoin does, and an alternative is surfacing: Ethereum. If altcoins take bitcoin's place, there would be a drop in value and a drop out of the market, and investors who put a lot into it would suffer huge losses.

Conclusion

Like I said, it is very difficult to tell how the future of bitcoin will go, but in any case, do your research before you ape in, and have a strong mind to stick to your decision, whatever the turn of events. Thank you. 2022-07-25 09:13:49
Overseas Tech DEV Community What does a Business Automation Manager do? https://dev.to/finnauto/what-does-a-business-automation-manager-do-3h6m

What does a Business Automation Manager do?

When I joined FINN in April, I was welcomed with incredible enthusiasm and was constantly reminded by many people what an exciting role I have. Admittedly, it all sounded a bit strange at the beginning, but now that I am three months into this job, I can comfortably say they were right. I am the first person officially joining FINN as a Business Automation Manager (BAM): a new role at FINN and a new role in the industry. As the job title already indicates, this position sits at the intersection of business and tech and deals with automating business processes.

The job combines different disciplines. A BAM needs a good understanding of business processes; depending on the department you are supporting, this can have a different focus. Analytical skills are needed to comprehend these processes and create them from scratch, and technical expertise is required to put them into action. Generally, my position acts as a point of communication between our business and tech teams. Understanding the business team's requirements for possible automations is necessary in order to then discuss the best approach for tackling the problem with the tech team. Decisions are made between a low-code solution, which is mostly implemented by me, and a pro-code approach, which is code written by professional developers (our engineering team).

The main tool that I use is a low-code tool named Make (formerly Integromat). It enables the user to easily create automated workflows via an intuitive UI. Those workflows do not only represent business processes but also support integration with third-party providers, for example email services and databases. This tool helped FINN grow very quickly, as time-consuming manual work was automated without needing the resources of backend engineers. (Discussing new Make automations with colleagues.)

I am supporting the Operations Development Team, which is responsible for handling the timely delivery of our cars and the return process after a subscription ends. Hence, I am responsible for all automations that deal with the car coming from the OEM to the compound, being delivered to the customer, and in the end going back to the compound as well as to one of our remarketing partners. This comprises integrating processes with many different partners, with whom we maintain a close data exchange to ensure seamless processes. My daily work thus includes the following tasks.

Daily Tasks

Creating MVPs for automating manual tasks: When we are, for example, planning to integrate with new service partners and need to share information, low-code automations provide quick solutions. When the scope of the process increases, we reconsider whether it is more suitable for a pro-code implementation.

Writing technical proposals: When conceptualizing bigger projects, different approaches must be planned before the implementation. This means discussing and getting into feedback loops with various stakeholders, in our case the product and operational teams. A specific example is the defleeting topic (cars leaving the FINN fleet), which has not been supported by tech solutions as much as the infleeting processes, as it happens at a later stage in our business lifecycle. To create a holistic solution, thorough planning is required.

Maintaining existing scenarios: This comprises making sure automations run reliably and smoothly. As FINN grows as a company, our data transfer rate is also increasing, and more often we are hitting technical boundaries in the respective modules in Make. To ensure that all scenarios I am responsible for are running, I built another Make scenario that checks the recent statuses of the automations and informs me which ones hit an error or warning.

Questioning the status quo: If an automation is not very robust and regularly fails, it is time to refactor the scenario and adapt it to low-code best practices. It is also possible that business circumstances or external requirements change, which requires updating the scenarios.

Participating in the low-code automation squad: Here we are trying to establish best practices for low code in the company. That is achieved with a new Make onboarding with hands-on exercises for all new joiners. By doing this, we empower everyone at FINN to use our low-code tools for smaller automations.

Documenting existing scenario landscapes with Miro, to keep a good overview of them.

Skills that are important for the job

Skills that helped me in the job are being open and communicative, since I am the contact person for various stakeholders. A good mix of paying attention to small details while staying creative enough to consider entire processes is crucial too. For the latter, analytical thinking and a sense of pragmatism are required. Generally, a good understanding of business and tech basics will help you succeed in the job.

Challenges in the job

Starting your first job after university can be challenging. Personally, I was feeling overwhelmed by the number of existing scenarios and the maintenance work that comes with them. It can be demanding to reconstruct someone's thinking behind an automated flow. Furthermore, understanding the complex business processes that stem from integration with many different stakeholders was a challenge. However, building better documentation around the automations that support these business processes was a good way for me to tackle that complexity.

Conclusion

As a BAM you have great possibilities to develop in different directions and set the focus as you wish. If you are interested in the strategic alignment of the product, the position gives you that flexibility. But moving in a more technical direction, thus including pro code in automations and workflows, is also a possibility. For me the position seems like a perfect fit, since in my bachelor's I studied International Business Administration and I completed my master's degree in Information Systems Management. Thus, as a first job after university, this role gives me the opportunity to explore a broad field of topics. Check out the position of Business Development Manager for Automation if you want to be part of FINN's automation journey. 2022-07-25 09:04:00
Medical CBnews (medical/nursing care) Identification of close contacts to be 'concentrated at high-risk facilities' https://www.cbnews.jp/news/entry/20220725180645 Ministry of Health, Labour and Welfare 2022-07-25 18:40:00
Medical CBnews (medical/nursing care) Financial support for dispatched nursing staff extended through the end of September - MHLW https://www.cbnews.jp/news/entry/20220725180344 postponement 2022-07-25 18:10:00
Finance RSS FILE - Japan Securities Dealers Association Trading status of over-the-counter securities issued by listed-security issuers https://www.jsda.or.jp/shiryoshitsu/toukei/toriatsukai/index.html OTC securities 2022-07-25 10:00:00
Finance RSS FILE - Japan Securities Dealers Association Bond lending transaction balances (formerly: bond lending transaction status) https://www.jsda.or.jp/shiryoshitsu/toukei/taishaku/index.html lending 2022-07-25 09:30:00
Finance RSS FILE - Japan Securities Dealers Association Securities Industry Bulletin, July 2022 https://www.jsda.or.jp/about/gaiyou/gyouhou/22/2207gyouhou.html securities 2022-07-25 09:15:00
Overseas News Japan Times latest articles Specter of COVID-19 will hang over remainder of NPB season https://www.japantimes.co.jp/sports/2022/07/25/baseball/japanese-baseball/npb-covid-regular-season/ havoc 2022-07-25 18:30:04
Overseas News Japan Times latest articles Back in business? The key sector missing out on Japan’s tourism reboot https://www.japantimes.co.jp/life/2022/07/25/travel/japan-international-conferences-future/ Organizers of international conferences and exhibitions will need to wait a little longer before business travelers return to Japan in droves 2022-07-25 18:20:49
News BBC News - Home NHS in England faces worst ever staffing crisis, MPs warn https://www.bbc.co.uk/news/health-62267282?at_medium=RSS&at_campaign=KARANGA england 2022-07-25 09:33:43
News BBC News - Home Rishi Sunak and Liz Truss: Stakes high for first head-to-head debate https://www.bbc.co.uk/news/uk-politics-62272482?at_medium=RSS&at_campaign=KARANGA chris 2022-07-25 09:44:27
News BBC News - Home Ryanair boss hits out at airports: 'They had one job' https://www.bbc.co.uk/news/business-62289056?at_medium=RSS&at_campaign=KARANGA sorahan 2022-07-25 09:26:59
News BBC News - Home Elon Musk denies affair with Google co-founder Sergey Brin's wife https://www.bbc.co.uk/news/business-62288139?at_medium=RSS&at_campaign=KARANGA journal 2022-07-25 09:44:07
News BBC News - Home Eurotunnel and Dover queues: Drivers warned of summer of Channel traffic delays https://www.bbc.co.uk/news/uk-62289432?at_medium=RSS&at_campaign=KARANGA channel 2022-07-25 09:39:11
Business fukeiki.com Tokyo remodeling company Enishi Corporation headed for bankruptcy with ¥1 billion in liabilities - fukeiki.com https://www.fukeiki.com/2022/07/enishi-corp.html corporation 2022-07-25 09:44:16
Business fukeiki.com Kitakyushu's DAP Technology enters special liquidation with ¥13.8 billion in liabilities - fukeiki.com https://www.fukeiki.com/2022/07/dap-technology.html special liquidation 2022-07-25 09:25:32
Hokkaido Hokkaido Shimbun LDP's Motegi: 'Opposition split was the reason we won the upper house election'; a united front would have made it a tough fight https://www.hokkaido-np.co.jp/article/709921/ central Tokyo 2022-07-25 18:39:46
Hokkaido Hokkaido Shimbun <Mori> Oba, a third-year at Mori Junior High, advances to the national kendo championships: 'I'll polish my men strike further and win my first match' https://www.hokkaido-np.co.jp/article/709924/ polish 2022-07-25 18:38:00
Hokkaido Hokkaido Shimbun 9,264 infections in Saitama; 4 cases withdrawn https://www.hokkaido-np.co.jp/article/709923/ withdrawal 2022-07-25 18:37:00
Hokkaido Hokkaido Shimbun Nadeshiko Japan to face China on the 26th in the final match of the East Asian E-1 Championship https://www.hokkaido-np.co.jp/article/709894/ Japan national team 2022-07-25 18:22:12
Hokkaido Hokkaido Shimbun All-Star replacements: Hanshin's Iwazaki, the Giants' Walker and others selected; Nippon-Ham's Kiyomiya makes his first appearance, Matsumoto withdraws with an injury https://www.hokkaido-np.co.jp/article/709890/ Hanshin 2022-07-25 18:37:06
Hokkaido Hokkaido Shimbun Children grow freely by caring for animals at Onuma's 'Kodomoen Suho', a preschool with almost no set curriculum https://www.hokkaido-np.co.jp/article/709922/ Onuma Park 2022-07-25 18:35:00
Hokkaido Hokkaido Shimbun Security vehicle escorting Akie Abe involved in accident on Metropolitan Expressway; no injuries https://www.hokkaido-np.co.jp/article/709876/ Akie Abe 2022-07-25 18:22:12
Hokkaido Hokkaido Shimbun Railway history on show at Rankoshi Station, with rare goods on display and for sale https://www.hokkaido-np.co.jp/article/709920/ railway history 2022-07-25 18:30:00
Hokkaido Hokkaido Shimbun 'Firmly opposed, and that hasn't changed': national fisheries federation statement on treated nuclear plant water https://www.hokkaido-np.co.jp/article/709919/ National Federation of Fisheries Cooperative Associations 2022-07-25 18:28:00
Hokkaido Hokkaido Shimbun Giant three-shaku fireworks carry wishes for the pandemic's end as the Otaru Ushio Festival closes https://www.hokkaido-np.co.jp/article/709918/ wish 2022-07-25 18:27:00
Hokkaido Hokkaido Shimbun Cabinet Office forecasts 2.6% inflation, the highest in eight years, as the weak yen takes its toll https://www.hokkaido-np.co.jp/article/709917/ inflation rate 2022-07-25 18:26:00
Hokkaido Hokkaido Shimbun Ukraine 'should not cede territory', says former US Secretary of State Kissinger https://www.hokkaido-np.co.jp/article/709911/ Secretary of State 2022-07-25 18:15:00
Hokkaido Hokkaido Shimbun Fireworks and night views make for a superb cruise as the Garinko-go 3 sails Monbetsu harbor https://www.hokkaido-np.co.jp/article/709915/ fireworks display 2022-07-25 18:19:00
Hokkaido Hokkaido Shimbun Passing traditional Japanese hairstyling to future generations: Tomakomai hairdresser Iwasaki, as factory-made wigs dominate and artisans dwindle https://www.hokkaido-np.co.jp/article/709912/ Sumiyoshi-cho, Tomakomai 2022-07-25 18:16:00
Hokkaido Hokkaido Shimbun Russia notified that Abe's state funeral will be held; Isozaki: 'We maintain diplomatic relations' https://www.hokkaido-np.co.jp/article/709910/ press conference 2022-07-25 18:14:00
Hokkaido Hokkaido Shimbun Tokyo yen trades in the lower 136 range against the dollar https://www.hokkaido-np.co.jp/article/709906/ Tokyo foreign exchange market 2022-07-25 18:13:00
Hokkaido Hokkaido Shimbun 7,785 COVID-19 infections in Osaka; 1 death, 9 cases withdrawn https://www.hokkaido-np.co.jp/article/709904/ withdrawal 2022-07-25 18:12:00
Hokkaido Hokkaido Shimbun Strong turnout at Nishi-Iburi shell mounds, one year on the 26th since the Jomon sites' World Heritage listing https://www.hokkaido-np.co.jp/article/709903/ Toyako Town 2022-07-25 18:11:00
Hokkaido Hokkaido Shimbun Osaka tops city ranking for second straight year in evaluation of 138 cities nationwide https://www.hokkaido-np.co.jp/article/709902/ cities 2022-07-25 18:02:00
Hokkaido Hokkaido Shimbun Honbetsu town assembly election announced: 15 candidates for 12 seats https://www.hokkaido-np.co.jp/article/709901/ end of term 2022-07-25 18:01:00
News Newsweek Overwhelming images as amateur researchers process raw data from the James Webb Space Telescope https://www.newsweekjapan.jp/stories/world/2022/07/post-99195.php 2022-07-25 18:08:01
IT Weekly ASCII Go see the animals at night! Yokohama's Kanazawa Zoo holds 'Night Kanazawa ZOO' on weekends and holidays in August https://weekly.ascii.jp/elem/000/004/099/4099226/ Yokohama Kanazawa Zoo 2022-07-25 18:30:00
