IT |
InfoQ |
Visual Studio 2022 17.7 Preview 3 With Productivity Updates |
https://www.infoq.com/news/2023/07/vs2022-v17-7-preview-3/?utm_campaign=infoq_content&utm_source=infoq&utm_medium=feed&utm_term=global
|
Visual Studio Preview With Productivity Updates: Microsoft has released the third preview of Visual Studio 2022 version 17.7. The preview brings a range of improvements and features aimed at enhancing developer productivity and at helping maintain clean code, with a focus on a new tool called "#include cleanup" for C++ developers. The latest version is already available for download. By Almir Vuk |
2023-07-17 07:05:00 |
python |
New posts tagged Python - Qiita |
I tried uploading a CSV file to XAMPP and retrieving it with Python |
https://qiita.com/kinne/items/5ed9efc9646aeed0e17e
|
pandas |
2023-07-17 16:09:14 |
js |
New posts tagged JavaScript - Qiita |
【General】Cont. Example of building a solution using YouTube API |
https://qiita.com/Miki_Yokohata/items/413f2210f6208d97a741
|
youtube |
2023-07-17 16:13:55 |
AWS |
New posts tagged AWS - Qiita |
Build a bastion server (EC2) for connecting to AWS RDS (Postgres) using Terraform! |
https://qiita.com/shun198/items/97c4b71624b168e74c3f
|
rdspostgres |
2023-07-17 16:30:18 |
AWS |
New posts tagged AWS - Qiita |
Automatically building a serverless batch execution environment with AWS Batch on Fargate using Terraform (monitoring edition) |
https://qiita.com/neruneruo/items/12291a29a38800646e21
|
awsbatch |
2023-07-17 16:21:33 |
AWS |
New posts tagged AWS - Qiita |
Deepening my understanding of Cognito |
https://qiita.com/masakun1150/items/04d4497c8cc617e58fa3
|
cognito |
2023-07-17 16:12:35 |
golang |
New posts tagged Go - Qiita |
[Azure] Container Apps feature investigation x making requests with HttpClient |
https://qiita.com/moff-bear/items/e5503c3f4d1d6d24c9ba
|
azure |
2023-07-17 16:24:00 |
Azure |
New posts tagged Azure - Qiita |
[Azure] Container Apps feature investigation x making requests with HttpClient |
https://qiita.com/moff-bear/items/e5503c3f4d1d6d24c9ba
|
azure |
2023-07-17 16:24:00 |
Overseas TECH |
DEV Community |
AWS open source newsletter, #165 |
https://dev.to/aws/aws-open-source-newsletter-165-4k9g
|
AWS open source newsletter, #165 (July instalment)

Welcome to #165 of the AWS open source newsletter, the only newsletter that brings you the best and latest open source content. We have some great new projects this week, including a tool for IoT developers to help you validate your SQL statements, a command line interface tool for Amazon Verified Permissions, an Amazon DynamoDB estimation tool, and more. Also featured this week is content on Apache Iceberg, OpenSearch, PostgreSQL, Kubernetes, Powertools for AWS Lambda, Spring Boot, Babelfish for Aurora PostgreSQL, Karpenter, Apollo GraphQL, JupyterHub, dbt, Apache Airflow, Cedar, and Apache Flink. We have added quite a few new events this week, so make sure you check those out, as well as the Video section for some must-watch videos (this week, from me at least).

Feedback
Before you dive in, however, I need your help. Please, please, please take a minute to complete this short survey, and you will forever have my gratitude.

Celebrating open source contributors
The articles and projects shared in this newsletter are only possible thanks to the many contributors in open source. I would like to shout out and thank those folks who really do power open source and enable us all to learn and build on top of what they have created. So thank you to the following open source heroes: Alina Dima, Daniel Aniszkiewicz, Gabe Hollombe, Salman Sali, Tomasz Dudek, Rafał Mituła, Rafal Gancarz, Sylvia Lin, Heitor Lessa, Zygimantas Koncius, Sudhir Gupta, Alok Srivastava, Wanchen Zhao, Anjali Dhanerwal, Mohammed Asadulla Baig, Aychin Gasimov, Johannes Koch, David Nalley, Mike Hicks, Oleg Z, and Mark Sailes.

Latest open source projects
The great thing about open source projects is that you can review the source code. If you like the look of these projects, make sure you take a look at the code, and if it is useful to you, get in touch with the maintainer to provide feedback, suggestions, or even submit a contribution. The projects mentioned here do not represent any formal recommendation or endorsement; I am just sharing for greater awareness, as I think they look useful and interesting.

Tools

validation tool for aws iot rules
validation tool for aws iot rules is a tool that enables developers to test and validate IoT Rules SQL statements without the heavy lifting of having to set up the infrastructure themselves. The goal of this framework is to provide developers with a closed-box type of validation tool, where only the input payload, SQL statement, and expected output need to be configured. If the Rules Engine parses the input as expected, with the expected output, tests will succeed; if not, they will fail. Expected versus actual output payload will be printed for reference. To find out more, be sure to read Alina Dima's post, "A Tool to Validate AWS IoT Rules SQL Statements", where she dives deeper into this topic and shows you how this tool can help. You can also check out the video she put together, which is super helpful.

avp-cli
avp-cli is a handy command line tool from AWS Community Builder Daniel Aniszkiewicz, designed to interact with the AWS Verified Permissions (AVP) service. You can use it to create, manage, and delete policy stores, schemas, and policies. Comprehensive documentation and examples are just another reason why I like this project, so thank you, Daniel, for putting this together. If you like this, make sure you star the project and let Daniel know for yourself.

WebTruss.EntityFrameworkCore.DynamoDb
WebTruss.EntityFrameworkCore.DynamoDb is an Entity Framework-like library for AWS DynamoDb for .NET developers, from Salman Sali. His goal is to provide a very easy to use, pick-up-and-go library for DynamoDb. Currently these are some of the functions that are supported: FirstOrDefault, Any, ExecutePut, ExecuteDelete, PagedList, ScannedList, but he will be adding more functions. Please feel free to raise an issue or PR for a feature. He also provides an example of how to use this framework in the repo.

dynamodb scaling simulator
dynamodb scaling simulator is a tool from my good friend and former colleague Gabe Hollombe that helps you plan appropriate configs for DynamoDB provisioned capacity with auto scaling. It helps you to simulate how a provisioned-capacity DynamoDB table will perform (will it throttle requests or not?) under different auto scaling configurations. It will also try to calculate the best config for you that results in the lowest cost and no throttles.

Demos, Samples, Solutions and Workshops

saas boilerplate
saas boilerplate is an opinionated full stack web app boilerplate, ready to be deployed to the AWS platform. It includes essential features that every SaaS application requires, such as frontend, backend API, admin panel, and workers. With a scalable AWS-based architecture and continuous deployment, you can easily deploy multiple environments representing different stages in your pipeline. It looks very comprehensive and provides lots of features, including but not limited to payments, authentication and authorisation, notifications, emails, and many more.

aws sam swift
aws sam swift contains a Cookiecutter template to create a serverless application based on the AWS Serverless Application Model (SAM) and Swift. This project is intended to be used as a template location for the SAM Command Line Utility (CLI); SAM generates a project on your local machine based on this template. You can learn more about this by checking out the documentation, "Swift Server Templates".

bespoke
bespoke is an open source Personalized Marketing Platform that combines the functionality of Mailchimp, Klaviyo's automation, Substack's newsletter, and Typeform for surveys. The repo provides deployment guidelines on how to get this up and running in your AWS environment. If this looks interesting to you, get in touch; I might be persuaded into putting together a blog post if enough people are interested. Hat tip to my colleague Gunnar for sharing this with me.

AWS and Community blog posts

Community round up
As regular readers will know, Apache Airflow is one of my favourite open source projects, and I
love reading community posts on this technology. I was delighted, therefore, when AWS Community Builders Tomasz Dudek and Rafał Mituła wrote "Dynamically create AWS ECS Task Definitions to simplify IaC and Semantic Versioning in AWS MWAA (Apache Airflow)", which looks at how they were able to use Airflow orchestration capabilities together with Amazon ECS to run their data engineering workflows at scale. Great post, and a must read this week.

The post "Instacart Creates a Self-Serve Apache Flink Platform on Kubernetes" from Rafal Gancarz pointed me back to the original post from Sylvia Lin, which I missed when it was originally posted back in April. In "Building a Flink Self-Serve Platform on Kubernetes at Scale", Sylvia explains how Instacart took the decision to move to self-managed Apache Flink running on Amazon EKS, and how this was able to help them with their mission of providing a robust self-serve Flink platform for their teams.

Powertools for AWS Lambda
Starting from v releases, Powertools for AWS Lambda builds are reproducible and signed publicly. You can take a look at the Security section of the documentation site, which documents how they are using the SLSA framework to follow supply chain security best practices. You can now verify releases using the SLSA verifier and check the public attestation details in the public repository on Rekor. This is a hot topic for all projects, and Heitor has put together very detailed documentation which you can study to see how you can apply it to your own projects. This is essential reading this week, folks, so make sure you do not skip this one.

JupyterHub
Check out JupyterHub on EKS, which provides an emerging EKS Blueprint that combines the versatility of JupyterHub with the scalability and flexibility of Kubernetes. The blueprint leverages the Data on EKS project and simplifies how you can deploy this. Check out the documentation and code on the GitHub repo to get started. [hands-on]

dbt
In the post "Configure end-to-end data pipelines with Etleap, Amazon Redshift, and dbt", Zygimantas Koncius from Etleap collaborates with Sudhir Gupta from AWS to show how Etleap's end-to-end pipelines enable data teams to simplify their data integration and transformation workflows, as well as achieve higher data freshness.

PostgreSQL
A few posts of note this week on all things PostgreSQL. Kicking things off, we have "Cross-account Amazon Aurora PostgreSQL and Amazon RDS for PostgreSQL migration with reduced downtime using AWS DMS", where Alok Srivastava and Wanchen Zhao provide a practical, hands-on guide to the various steps involved in migrating your Aurora PostgreSQL or RDS for PostgreSQL database from one AWS account to another. [hands-on] Following that, Anjali Dhanerwal and Mohammed Asadulla Baig have put together the post "Improve app performance through pipelining queries to Amazon RDS for PostgreSQL and Amazon Aurora PostgreSQL", which shows you how pipelining PostgreSQL queries can help improve overall application performance by reducing query latency over the network. [hands-on] To finish up this collection, we had the post "Migrate PostgreSQL from Google Cloud Platform to Amazon RDS with minimal downtime", which shows you a procedure to migrate a PostgreSQL database from Google Cloud Platform (GCP) Cloud SQL to Amazon Relational Database Service (Amazon RDS) for PostgreSQL. The author, Aychin Gasimov, explores how you can use the AWS Database Migration Service to enable this, and how it is not limited to this one cloud provider.

Other posts and quick reads
"MuleSoft Anypoint Runtime Fabric Deployment On Amazon EKS Anywhere" looks at how to deploy MuleSoft Anypoint Runtime Fabric on Amazon EKS Anywhere to bring the power and flexibility of MuleSoft's runtime environment to on-premises and edge locations. [hands-on] "Apollo GraphQL Federation with AWS AppSync" is an update to a post that I shared earlier, updated to comply with the new Apollo Federation spec. "New Solution: Clickstream Analytics on AWS for Mobile and Web Applications" provides a
supporting blog post on the clickstream project I shared in a previous edition of the newsletter, and will walk you through everything you need to know to get started. [hands-on] "Better Together: Graviton and GP with Amazon OpenSearch Service" looks at the cost savings you might see by running Amazon OpenSearch Service on Graviton-based instances with gp-based EBS storage, compared with running OpenSearch on traditional x86-based instances with gp-based EBS storage. [hands-on]

Case Studies
"Leveraging Open Source at Barclays to Enable Lambda Event Filtering with AWS Glue Schema Registry" dives into how Barclays achieved AWS Glue Schema Registry integration with AWS Lambda event filtering by leveraging the open source library. "Best Practices for Optimizing Kubernetes Costs on AWS with StormForge and Karpenter" looks at how this AWS Partner can help businesses leverage Karpenter to optimise both application and infrastructure configurations to reduce Kubernetes costs. "How Amazon EKS and Precisely's Geo Addressing SDK Power Real-Time Decisions" looks at how geo-addressing capabilities can be deployed on AWS infrastructure to accelerate analysis and power real-time decision making.

Quick updates

Apache Iceberg
AWS Glue Crawlers now support Apache Iceberg tables, simplifying the adoption of the AWS Glue Data Catalog as a catalog for Iceberg tables and migration from other Iceberg catalogs. Apache Iceberg is an open source table format for data stored in data lakes that helps data engineers manage complex challenges, such as managing continuously evolving data sets while maintaining query performance. With this launch, you can automatically register Iceberg tables into the Glue Catalog by running a Glue Crawler. You can then query Glue Catalog Iceberg tables across various analytics engines, and apply Lake Formation fine-grained permissions when querying from Amazon Athena. When migrating from other Iceberg catalogs, you can create and schedule a Glue Crawler and provide one or more Amazon S3 paths where the Iceberg tables are located. You have the option to provide the maximum depth of S3 paths that the Glue Crawler can traverse. With each run, the Glue Crawler will extract schema information and update the Glue Catalog with the schema changes. The Glue Crawler supports schema merging across snapshots, and updates the latest metadata file location in the Glue Catalog so that AWS analytical engines can use it directly.

OpenSearch
You can now run a new OpenSearch version in Amazon OpenSearch Service, with several improvements to observability, security analytics, index management, and geospatial capabilities. The new version includes features that were launched as part of recent open source OpenSearch versions. Some of the key improvements include the introduction of Simple Schema for Observability, providing a unified schema for OpenSearch; the ability to add map visualisations to Dashboard panels; the ability to filter geospatial data against geospatial field types; support in security analytics for five new log types; and the ability to define threat detectors with multiple indexes or index patterns. This release also includes improvements to the Dashboards user interface (UI), such as the ability to manage data streams from the Index Management UI, perform manual rollover and force merges for indices or data streams, and manage multiple index templates using component templates. The UI improvements also include support for dynamic tenant management, the ability to access observability features from the Dashboards main menu, and the ability to add event analytics visualisations to dashboards. Other improvements include support for the flat object field type and support for replication of kNN indices in cross-cluster replication.

Kubernetes
Amazon GuardDuty EKS Runtime Monitoring continuously monitors and profiles container runtime activity to identify malicious or suspicious behavior within container workloads. Using a lightweight, fully managed eBPF security agent, GuardDuty monitors on-host
operating system-level behavior, such as file access, process execution, and network connections. Once a potential threat is detected, GuardDuty generates a security finding that pinpoints the specific container and includes details such as pod ID, image ID, EKS cluster tags, executable path, and process lineage. The Amazon GuardDuty EKS Runtime Monitoring eBPF security agent now supports Amazon Elastic Kubernetes Service (Amazon EKS) workloads that use the Bottlerocket operating system, AWS Graviton processors, and AMD processors. Additionally, the new agent version introduces performance enhancements, built-in CPU and memory utilisation limits, and support for Amazon EKS clusters.

PostgreSQL
Following the announcement of updates to the PostgreSQL database by the open source community, we have updated Amazon Aurora PostgreSQL-Compatible Edition to support the new PostgreSQL minor versions. These releases contain product improvements and bug fixes made by the PostgreSQL community, along with Aurora-specific improvements. This release also contains new features and improvements for Babelfish for Aurora PostgreSQL, and improved support for the AWS Database Migration Service Aurora PostgreSQL target endpoint and Babelfish data types.

Videos of the week

AWS On Air ft. Open Source Security at AWS
AWS is committed to raising standards for open source security by developing key security-related technologies with community support, and by contributing code, resources, and talent to the broader open source ecosystem. We recently launched the open source Cedar policy language and SDK, the technology that powers Amazon Verified Permissions, for writing and enforcing authorisation policies for your applications using automated reasoning. By open sourcing Cedar, we are providing transparency into Cedar's development, inviting community contributions, and hoping to build trust in Cedar's security. David Nalley and Mike Hicks talk more about open source security at AWS and the Cedar project in this discussion and demo.

Powertools for AWS Lambda
This is the follow-up to the first episode, shared in an earlier edition, where AWS Hero Johannes Koch and Heitor Lessa look at the continuous deployment pipeline for Powertools for AWS Lambda for Python. It explores areas such as sealed source code, automated testing and publishing, and a cross-regional roll out. A must-watch session, as Heitor has really been raising the bar in architecting the infrastructure used to build this project.

Spring Cloud Function and AWS: Performance, Portability and Productivity
Spring Cloud Function enables you to write simple Java functions that, combined with Spring Boot, can easily be realised in a serverless environment such as AWS Lambda. Implementing business logic as functions is a perfect fit for AWS Lambda's event-driven architecture. How do you get the most from both of these technologies and their features? Are there any useful tricks? What about Spring Native and AOT? In this video, Oleg Z and Mark Sailes discuss and demonstrate these features and enhancements, with a single function handling requests.

Build on Open Source
For those unfamiliar with this show, Build on Open Source is where we go over this newsletter and then invite special guests to dive deep into their open source project. Expect plenty of code, demos, and hopefully laughs. We have put together a playlist so that you can easily access all sixteen episodes of the Build on Open Source show: the Build on Open Source playlist. We are currently planning the third series; if you have an open source project you want to talk about, get in touch, and we might be able to feature your project in future episodes of Build on Open Source.

Events for your diary
If you are planning any events, whether virtual, in person, or hybrid, get in touch, as I would love to share details of your event with readers.

AWS Go Build On EKS Series: Cost
Online, July (am, UK time)
This is an L-level event that is split into two sessions. How to reduce costs is a topic on most people's minds today; in the first
session, you will cover how to reduce costs in your EKS workload, by first covering common cost factors and easy fixes to bring down your overall spend. You will also look at how to think about efficiency for your workload, and how to calculate your costs down to the pod level. You will also see a demo with Kubecost, including what factors to think about when installing and setting it up. In the second session, you will see practical examples that combine different techniques to help you build for sustainability on Amazon EKS, making use of Graviton, EC2 Spot, and effective autoscaling with Karpenter, among other topics. Register to save your space on the "AWS Go Build On EKS Series: Cost" page.

ElastiCache for Redis: Boost Your Startup's Performance with Lightning Fast Data
Online, July (pm, UK time)
In this online webinar, you will learn more about some of the advanced features of Amazon ElastiCache and MemoryDB. You will dive deep into the features of ElastiCache and MemoryDB, explore Redis topologies and features, look at ElastiCache and MemoryDB use cases, find out about best practices for highly available architecture, and hear about monitoring, sizing, and other best practices. This is an L-level session, and you can sign up to save your place on the registration page, "ElastiCache for Redis: Boost Your Startup's Performance with Lightning Fast Data".

Building Scalable Microservices with TiDB and AWS Lambda
AWS Offices in New York, July (am to pm)
TiDB is an open source NewSQL database that supports Hybrid Transactional and Analytical Processing workloads. It is MySQL compatible, and can provide horizontal scalability, strong consistency, and high availability. Join PingCAP and AWS for an in-person workshop where you'll learn about TiDB Serverless and AWS Lambda. You'll explore how to combine them to build scalable, highly available microservices while generating real-time insights directly from raw application data. You will need to register to save your place, so head over to "Building Scalable Microservices with TiDB and AWS Lambda" to find out more.

Discover AWS Cedar With its Creator and Community Leaders
Online, July (EST/GMT)
If you are interested in Cedar, then make sure you check out this must-attend online event discussing AWS's new open source policy language and engine, Cedar. Learn more about its benefits, its ecosystem integrations, and this new breakthrough in the IAM space. Plenty of familiar faces, including Mike Hicks, Or Weis, Daniel Aniszkiewicz, and Filip Grebowski, will be covering the Cedar policy language and how it works, and you can expect plenty of demos. Sign up and register your spot over at the "Discover AWS Cedar With its Creator and Community Leaders" page.

OpenSearchCon
Seattle, September
Registration is now open for OpenSearchCon. Check out the post from Daryll Swager, "Registration for OpenSearchCon is now open", which provides you with what you can expect and the resources you need to help plan your trip.

CDK Day
Online, September
Back for the fourth instalment, this community-led event is a must-attend for anyone working with infrastructure as code using the AWS Cloud Development Kit (CDK). It is intended to provide learning opportunities for all users of the CDK and related libraries. The CFP is open, so if you have some ideas for talks, make sure you check that section out. Also, this year they are accepting talks in Español. Woohoo, love it. Check out more at the CDK Day website.

Cortex
Every other Thursday
The Cortex community call happens every two weeks on Thursday, alternating between two UTC times. You can check out the GitHub project for more details; go to the Community Meetings section. The community calls keep a rolling doc of previous meetings, so you can catch up on the previous discussions. Check the Cortex Community Meetings Notes for more info.

OpenSearch
Every other Tuesday (pm, GMT)
This regular meet-up is for anyone interested in OpenSearch & Open Distro. All skill levels are welcome, and they cover and welcome talks on topics including search, logging, log analytics, and data visualisation. Sign up to the next session, the OpenSearch Community Meeting.

Stay in touch with open source at AWS
Remember to check out the Open Source homepage to keep up to date with all our activity in open source, and follow us on AWSOpen. |
2023-07-17 07:52:32 |
Overseas TECH |
DEV Community |
Open Source Licensing the Oracle Machine |
https://dev.to/polterguy/open-source-licensing-the-oracle-machine-3hag
|
Open Source Licensing the Oracle Machine: The system is rigged. If you have two startups with competing products, and one has inferior technology but is seed funded by YCombinator, the one with seed funds from YCombinator will win. I could give you a bajillion additional examples of systemic corruption, but I suspect you already understand what I'm talking about. Free enterprise and fair competition is the laughing joke of the millennium. The establishment will happily share whatever spoils they're able to steal from you with each other, ensuring inferior products win by using their monopolistic powers, making sure whatever competing products exist out there never get the light of day. Unless you're funded by a triple-A VC fund with ties to Silicon Valley, you'll never create a successful company. They're controlling the world, and hence nobody hears about you unless they allow for it. Hence, in order to get what I deserve, I will first have to destroy the system in its entirety. I'm therefore open source licensing our Oracle Machine, so as to destroy the business value of the following companies and organisations: WikiPedia, StackOverflow, YCombinator, Reddit, Google. Feel free to fill in the blanks. The idea is to apply as much damage as possible to the above organisations, making them irrelevant, using any constructive means within my reach, such that free enterprise and innovation without corruption can once again flourish. Hopefully, by open source licensing our Oracle Machine, I can apply a lot of damage to all of the above organisations and companies. You can find our repositories below: Oracle Machine Backend; Oracle Machine Frontend. WTF is the Oracle Machine?

The value proposition: If used intelligently, the Oracle Machine can provide a small startup with a single employee infinite access to high quality content articles that score high on SEO, providing you with high quality backlinks to your website. This allows you as an entrepreneur to focus on creating great products while spending only minutes per day on marketing. Basically: destroying Google, WikiPedia, StackOverflow, and YCombinator in one smash. If used intelligently, it will allow your original website to score highly on all SEO parameters, and thus pillage the SERP for your keywords, making publications such as Hacker News and other corrupt Silicon Valley publishing houses irrelevant. You'll still need to understand SEO and have some basic theoretical knowledge of how search engines work, and I suspect you're smart if you somehow hide the fact that you're using the Oracle Machine, by for instance integrating it with your existing website or something. But if you use it carefully, restrict yourself to your chosen keywords within your existing niche, and don't produce too many articles per day, you can balance the scale of innovation, making the value propositions provided by YCombinator and other corrupt VC organisations irrelevant. It equalises the playing field. YCombinator, I warned you. Start paying me, or I will turn your temples of worship into dust. One line of code at a time. |
2023-07-17 07:45:03 |
Overseas TECH |
DEV Community |
How to implement Adapter pattern in Ruby on Rails? |
https://dev.to/vladhilko/how-to-implement-adapter-pattern-in-ruby-on-rails-1ibg
|
How to implement Adapter pattern in Ruby on Rails?

Overview
In this article, we are going to discuss the Adapter pattern. We will explore what the Adapter pattern is, why we need it, and how it can be implemented in a Rails application. We will also examine the benefits and drawbacks it provides. Additionally, we will provide three different examples to help better understand the concept and the reasons behind it.

Definition
In simple terms, the Adapter pattern allows you to transform the interface of a class into a different interface that clients expect. By doing so, it enables classes with incompatible interfaces to collaborate and work together seamlessly.

We usually use the Adapter pattern for the following purposes:

- Integrating incompatible components. When you need to integrate two components that have incompatible interfaces, an Adapter can bridge the gap by providing a common interface that allows them to work together seamlessly.
- Implementing backward compatibility. If you need to make changes to an existing class or component but want to maintain compatibility with code that relies on the old interface, an Adapter can be used to translate between the old and new interfaces.
- Working with external libraries, services, or legacy systems. When integrating with external libraries or services that have their own specific interfaces, an Adapter can be used to adapt their interface to fit the interface expected by your application. Adapters are also particularly useful when working with legacy systems or third-party components that have outdated or incompatible interfaces.

The Adapter pattern has the following pros and cons.

Advantages:

- Enhanced flexibility. The Adapter pattern makes it easier to switch between implementations or integrate new components without modifying existing code.
- Enhanced testability. The Adapter pattern enhances testability by isolating components, facilitating the use of mocking and dependency injection, and providing a clear interface for the adapted components.
- Encapsulation of complexity. The Adapter pattern encapsulates complexity by simplifying complex logic, hiding implementation details, and separating concerns.
- Independence from external solutions. The Adapter pattern enables you to switch or replace an external solution without affecting the rest of your codebase. This flexibility allows you to choose the most suitable external solution for your needs, without the need for extensive modifications throughout your codebase.

Problems:

- Potential increase in code complexity. Introducing adapters can add complexity to the codebase, especially when dealing with a large number of adapters or complex adaptation logic.
- Higher learning curve for developers. Developers who are not familiar with the Adapter pattern may require additional time and effort to understand the purpose, design, and usage of adapters in the codebase. This learning curve can slow down development and onboarding processes for new team members.

Implementation
Let's start by implementing our first adapter. We'll begin with a simple abstract example to help us grasp the concept. Afterward, we'll provide two additional examples and demonstrate how to integrate them into a Rails application.

Example 1: Cat and Dog
In this example, we will discuss how two different objects with different interfaces can collaborate. Let's imagine that we have the following two classes:

```ruby
class Cat
  def self.meow
    p 'Meow'
  end
end

class Dog
  def self.woof
    p 'Woof'
  end
end

Cat.meow # => 'Meow'
Dog.woof # => 'Woof'
```

For example, let's say we would like to use one of them based on a certain condition, like the one shown below:

```ruby
if Rails.env.test?
  Cat.meow
else
  Dog.woof
end
```

Having many such conditions scattered throughout the code makes it difficult to maintain and less manageable. Therefore, we need to find a solution to address this issue. The main goal of the Adapter pattern is to solve this problem by creating a common interface without altering the initial implementation. Let's explore how we can achieve it. Firstly, we need to create a new Adapter class for each class that we intend to adapt to a new interface:

```ruby
class CatAdapter
  def self.speak
    Cat.meow
  end
end

class DogAdapter
  def self.speak
    Dog.woof
  end
end

CatAdapter.speak # => 'Meow'
DogAdapter.speak # => 'Woof'
```

We have created a common interface for the two objects, which allows us to choose which one to use in a single location, without the need for additional changes throughout the application. For instance, we can implement the following code at the Rails configuration level:

```ruby
AnimalAdapter = Rails.env.test? ? CatAdapter : DogAdapter
```

Now we can utilize AnimalAdapter throughout the entire application, and this adapter will encapsulate the logic of Cat under the hood:

```ruby
AnimalAdapter.speak # => 'Meow'
```

Example 2: Temporary Data Storage
In this example, we are going to implement a temporary data storage that should perform the following functions:

- Add the given data by key
- Retrieve the data by key
- Remove the data by key

Such temporary data storage can be useful in various scenarios, including:

- Tracking the status of asynchronous jobs. For example, if we have a button that sends multiple emails, we can use temporary data storage to block the button and prevent duplicate sends until the job is finished and all emails are sent.
- Storing intermediate results during data processing. In data-intensive applications, temporary storage is often used to store intermediate results during data processing pipelines. This enables efficient data transformation, aggregation, or analysis before generating the final output.
- Temporary storage for any other purpose. Temporary data storage can be employed for various other use cases where we need to temporarily store data.

To demonstrate the Adapter pattern, we will create three different implementations, one for each Rails environment:

- For the Test environment, we will create a Memory-Based Temporary Data Storage.
- For the Development environment, we will create an ActiveRecord-Based Temporary Data Storage.
- For the Production environment, we will create a Redis-Based Temporary Data Storage.

Let's proceed with implementing each of them to gain a better understanding of the main purpose behind the Adapter pattern.

Memory-Based Temporary Data Storage
The first solution will be the simplest one: we will create a Memory-Based Temporary Data Storage using a Ruby Hash. We will only use this solution for testing purposes, so we don't need to be concerned about data storage reliability. First, let's add a new adapter file and define the common interface:

```ruby
# app/adapters/temporary_data_store_adapter/memory.rb

# frozen_string_literal: true

module TemporaryDataStoreAdapter
  class Memory
    def set(key, value); end

    def get(key); end

    def delete(key); end
  end
end
```

For our data storage, we have defined three required methods:

- set
- get
- delete

Now let's implement these methods:

```ruby
# app/adapters/temporary_data_store_adapter/memory.rb

# frozen_string_literal: true

module TemporaryDataStoreAdapter
  class Memory
    def initialize
      @store = {}
    end

    def set(key, value)
      store[key.to_s] = value.to_json
      'OK'
    end

    def get(key)
      return nil unless (value = store[key.to_s])

      JSON.parse(value)
    end

    def delete(key)
      return nil unless (value = store[key.to_s])

      store.delete(key.to_s)
      JSON.parse(value)
    end

    private

    attr_reader :store
  end
end
```

Let's check how it actually works in the Rails console:

```ruby
# rails c
adapter = TemporaryDataStoreAdapter::Memory.new
# => #<TemporaryDataStoreAdapter::Memory @store={}>
adapter.set('key', { example: 'example' })
# => 'OK'
adapter.get('key')
# => { 'example' => 'example' }
adapter.delete('key')
# => { 'example' => 'example' }
adapter.get('key')
# => nil
```

That's it! P.S. It also makes sense to add a clear method to remove all data from the Hash. You can run this method after every test that uses this adapter to avoid any global value problems in the specs:

```ruby
def clear
  store.clear
end
```

ActiveRecord-Based Temporary Data Storage
Our second implementation of Temporary Data Storage utilizes ActiveRecord. I have mainly added this implementation to enhance our understanding of the Adapter concept, rather than for real-life usage. Firstly, to use ActiveRecord, we need to create a new model and generate a migration to set up the data storage. Let's create
the following model app models temporary data entry rb frozen string literal trueclass TemporaryDataEntry lt ApplicationRecordendAnd the corresponding migration db migrate create temporary data entries rbclass CreateTemporaryDataEntries lt ActiveRecord Migration def change create table temporary data entries do t t string key null false t json data t timestamps t index key unique true end endendAfterward execute the following command rails db migrateNow everything is set up and we can proceed to create a new adapter Let s take a look at how this adapter will be implemented app adapters temporary data store adapter active record rb frozen string literal truemodule TemporaryDataStoreAdapter class ActiveRecord def set key data temp data entry TemporaryDataEntry find by key key if temp data entry present temp data entry update data data else TemporaryDataEntry create key key data data end OK end def get key TemporaryDataEntry find by key key amp data end def delete key TemporaryDataEntry find by key key amp delete amp data end endendLet s test these methods in the console rails cadapter TemporaryDataStoreAdapter ActiveRecord new gt lt TemporaryDataStoreAdapter ActiveRecord xec gt adapter set key example example gt OK adapter get key gt example gt example adapter delete key gt example gt example adapter get key gt nilThat s it Redis Based Temporary Data StorageOur final implementation will be based on Redis First of all let s install the Redis gem and ensure that we are connected to the server Add the following gem to your Gemfile Gemfile gem redis Then execute bundle installNext start the Redis server and verify if the connection is successful REDIS URL redis rails c Redis new url ENV fetch REDIS URL info gt redis version gt As we can see the Redis connection is successful Now let s proceed with adding a new Redis based temporary data storage adapter app adapters temporary data store adapter redis rb frozen string literal truemodule TemporaryDataStoreAdapter class 
Redis def set key value redis set key value to json end def get key return nil unless value redis get key JSON parse value end def delete key return nil unless value redis getdel key JSON parse value end private def redis redis Redis new url ENV fetch REDIS URL end endendLet s check if it works adapter TemporaryDataStoreAdapter Redis new gt lt TemporaryDataStoreAdapter Redis xc gt adapter set key example example gt OK adapter get key gt example gt example adapter delete key gt example gt example adapter get key gt nilThat s it All adapters have been successfully added Now let s take a look at how we can choose the one that is suitable for our needs Initialize AdapterWe re going to use different adapters depending on the Rails environment For the TEST environment we ll use the Memory Based Temporary Data Storage Adapter For the DEVELOPMENT environment we ll use the ActiveRecord Based Temporary Data Storage Adapter For the PRODUCTION environment we ll use the Redis Based Temporary Data Storage Adapter To solve this problem we ll initialize each adapter in their respective environments in the config environments directory Memory Based Temporary Data Storage Adapter config environments test rb frozen string literal trueRails application configure do config after initialize do config temporary data store adapter TemporaryDataStoreAdapter Memory new endendActiveRecord Based Temporary Data Storage Adapter config environments development rb frozen string literal trueRails application configure do config after initialize do config temporary data store adapter TemporaryDataStoreAdapter ActiveRecord new endendRedis Based Temporary Data Storage Adapter config environments production rb frozen string literal trueRails application configure do config after initialize do config temporary data store adapter TemporaryDataStoreAdapter Redis new endendOnce initialized we can call this adapter like this rails cRails application config temporary data store adapter gt lt 
TemporaryDataStoreAdapter ActiveRecord xec gt However this doesn t look too convenient so let s add a wrapper app adapters adapter rb frozen string literal truemodule Adapter class lt lt self def method missing method args amp block Rails application config public send method adapter rescue NameError super end endendNow we can call the adapter like this rails c e developmentAdapter temporary data store gt lt TemporaryDataStoreAdapter ActiveRecord xc gt rails c e testAdapter temporary data store gt lt TemporaryDataStoreAdapter Memory xa store gt rails c e productionAdapter temporary data store gt lt TemporaryDataStoreAdapter Redis xef gt That s it We now have an identical interface and our previous incompatible solution becomes interchangeable Example Datadog In the last example we will add an Abstract Monitoring Adapter to consolidate what we ve already learned Let s imagine that we have DataDog monitoring and we want to disable it for the development and test environments How would we do it Let s create the following app adapters monitoring adapter datadog rb frozen string literal truemodule MonitoringAdapter class Datadog def call puts Send a real API request to DataDog end endendAnd the Fake one frozen string literal truemodule MonitoringAdapter class Fake def call puts Pretend that the request to DataDog has been sent end endendLet s initialize them Fake config environments test rb frozen string literal trueRails application configure do config after initialize do config monitoring adapter MonitoringAdapter Fake new endendDatadog config environments production rb frozen string literal trueRails application configure do config after initialize do config monitoring adapter MonitoringAdapter Datadog new endendAnd let s check rails c e productionAdapter monitoring call gt Send a real API request to DataDog rails c e testAdapter monitoring call gt Pretend that the request to DataDog has been sentThat s it ConclusionIn summary the adapter pattern proves to be a 
valuable tool in Rails applications enabling the integration of components with different interfaces without modifying existing code By creating adapter classes for each component we can achieve a common interface and easily switch between implementations This approach enhances code maintainability reduces complexity and improves testability by isolating components and providing clear contracts With adapters developers can seamlessly integrate external libraries or services adapt to legacy systems and handle compatibility issues efficiently |
2023-07-17 07:44:20 |
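The adapter walkthrough in the Ruby article above can be condensed into a standalone, runnable sketch. Note that `MemoryStore` and `pick_adapter` are illustrative names introduced here, not the article's (which uses `TemporaryDataStoreAdapter::Memory` and Rails environment initializers); Rails and Redis are omitted so the sketch runs on its own:

```ruby
require 'json'

# In-memory implementation of the article's set/get/delete interface.
class MemoryStore
  def initialize
    @store = {}
  end

  def set(key, value)
    @store[key.to_s] = value.to_json # serialize, as the article does
    'OK'
  end

  def get(key)
    value = @store[key.to_s]
    value && JSON.parse(value)
  end

  def delete(key)
    value = @store.delete(key.to_s)
    value && JSON.parse(value)
  end
end

# Stand-in for the per-environment initializers: pick an adapter by
# environment name. The development/production branches would return
# the ActiveRecord/Redis adapters in the article; they are collapsed
# here to keep the sketch self-contained.
def pick_adapter(env)
  case env
  when 'test' then MemoryStore.new
  else MemoryStore.new
  end
end

adapter = pick_adapter('test')
puts adapter.set(:key, { 'example' => 'example' }) # prints OK
p adapter.get(:key)    # prints {"example"=>"example"}
p adapter.delete(:key) # prints {"example"=>"example"}
p adapter.get(:key)    # prints nil
```

Because every adapter exposes the same three methods, the calling code above would work unchanged against the ActiveRecord or Redis implementations; that interchangeability is the whole point of the pattern.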
海外TECH |
DEV Community |
The psychology 🧠of debugging - It Works 💯 |
https://dev.to/elliot_brenya/the-psychology-of-debugging-it-works-26a9
|
The psychology of debugging: It Works. Debugging code can be one of the most frustrating parts of programming. As a developer based in Ghana, I've had my fair share of late nights spent tracking down elusive bugs. When a piece of code doesn't work as expected, it's easy to get annoyed and even question your abilities. However, it's important to remember that debugging is a key part of the learning process. The best programmers aren't the ones who write perfect code; they're the ones who know how to methodically work through problems. Debugging teaches you how your code really works under the hood. Like learning a new language, you have to make mistakes first before truly understanding things. The key is to reframe debugging as a positive challenge rather than a personal failure. I try to approach each bug with curiosity: why is this happening, and what can I learn from it? This growth mindset helps me stay calm and focused on the problem at hand. I also take breaks when needed. If I'm clearly too frustrated to think straight, I'll take a short walk or listen to some afrobeats music. Stepping away often allows me to return with fresh eyes and new insights. It also helps to debug collaboratively. As developers in Ghana, we like to debug together on voice calls. Explaining your problem out loud and hearing another perspective can reveal overlooked solutions. Don't be afraid to ask for a second pair of eyes. With patience and the right mindset, debugging can actually be fun. There's no better feeling than finally squashing that pesky bug. Each fixed issue makes you a stronger programmer. Even after years of coding, I still get excited when I fix a tough error. That's the beauty of a growth mindset: there's always more to learn if you stay open and positive. Feel free to connect with me on Twitter: elliot mlaidv |
2023-07-17 07:13:49 |
海外TECH |
DEV Community |
UTM for Developers |
https://dev.to/crabnebula/utm-for-developers-1gec
|
UTM for Developers

UTM makes it easy to set up and manage macOS and Windows virtual machines. This can be especially useful for developers, such as Tauri contributors, who need to test their applications across multiple platforms, or for those looking to experiment with different operating systems without affecting their primary system. In this tutorial, we set up macOS and Windows virtual machines on UTM, a macOS application that provides a GUI wrapper for QEMU, a powerful open source emulator and virtualizer. UTM allows you to easily manage and run virtual machines without memorizing complex commands. It also has special handling for macOS, making it simpler to install compared to other virtual machine software. Before you get started, download and install UTM.

Setting up a macOS Virtual Machine

To create a macOS virtual machine:

1. Click on "Create Virtual Machine"
2. Choose "Virtualize"
3. Select "macOS"
4. Select an IPSW (iPhone Software, although this name is used for all Apple operating systems) or let UTM download it automatically
5. Configure system settings like CPU cores and storage
6. Save the configuration, and UTM will automatically download the macOS installer and install the OS on the virtual hard drive

Setting up a Windows Virtual Machine

For a Windows virtual machine:

1. Make sure you have the necessary prerequisites installed, such as Homebrew. See the instructions provided in the Windows install guide on the UTM website for detailed information. (Here's the guide for Windows 11 from UTM's website.)
2. Build the Windows ISO file
3. Create a new virtual machine in UTM and choose "Windows"
4. Uncheck the "Directly boot ISO CD image" option and ensure "Install drivers and SPICE tools" is checked
5. Upload the ISO file you created earlier
6. Configure other settings like memory, CPU cores, and storage as needed

After saving the configuration, UTM will boot the Windows installer, and you can follow the usual Windows installation process.

Managing Virtual Machines

Both macOS and Windows virtual machines can be customized and managed through UTM's user interface. You can adjust settings, add devices, and even set up shared folders to share files between the host and guest operating systems. Shared folders serve as a great method for passing files between the host OS and the VM guest. In order to make this possible:

1. Enable the shared folder in the macOS settings under "General" > "Sharing" > "File Sharing"
2. Locate the shared folder in "/Users/Public" (note: you can drag the folder into the sidebar for easier access)
3. The VM guest will now show up in your host machine's Finder under "Network"

For more detailed guidance on setting up and managing virtual machines with UTM, you can watch the livestream replay on YouTube. Author: Jonas Kruckenberg, DevRel |
2023-07-17 07:13:30 |
海外TECH |
DEV Community |
New Power Automate GPT Connector |
https://dev.to/wyattdave/new-power-automate-gpt-connector-58g7
|
New Power Automate GPT Connector

There are lots of ways to get ChatGPT power into your flows, but before, they required OpenAI and a custom connector. Now there is a new connector, out of the box and ready to go with no effort. "Create text with GPT" is an AI Builder connector, but there are a couple of points to consider:

- It's in preview, so it might change
- As it's in preview it's free, which is great, but we don't know how much it will cost
- It requires the environment feature (in settings) "AI Builder Preview Models" to be turned on
- This is GPT, not ChatGPT, so the same model but hosted by Microsoft, not OpenAI
- Again, as it's in preview, there is little documentation around token limits

That aside, it's pretty cool, and it solves problems that previously only a Human in the Loop could. Check out my previous blog on why Human in the Loop is so powerful here.

To use the connector, we need to create the prompt we are going to send to the model. We can paste it directly in, or, the easier way, use the prompt templates. The connector has default templates plus "create from blank". The templates are based on prompts, and though they cover the most used ones, there are many more:

- Summarize Text
- Extract Information
- Classify Text
- Create Blog
- All the rest (Create from blank)

Summarize Text

As you can see, we have the main parts of our prompt creator:

- Prompt
- Question / Key Data
- Fail threshold and action
- Context text

In this template it's:

- Summarize
- Less than … paragraphs, without adding new information
- If less than … paragraph, return "can't summarise a text that is too short"
- "include your text here"

So to use it, we would change "include your text here" to our input text. We can test the model to make sure it's right and make edits to any of the parts.

Context text (Summarized Test): "This text announces the public preview of Customer Managed Encryption Keys (CMK) for Power Automate. CMK provides customers with added control over their data, especially for highly regulated industries like Healthcare and Financial Services. It allows customers to leverage an encryption key from their own Azure Key Vault and apply it to any Power Platform environment. Additionally, admins can lock an environment to ensure total lockdown of their data."

Extract Information

Extract has the same parts, but in this example changed to:

- Extract container numbers and registration numbers
- If less than a couple of words, answer that you can't extract information
- "We are pleased to inform you that the shipment clearances for …"

If we test with the same context text from here, we get:

Extracted text: "Author: Rakesh Krishnan. Where to send questions and feedback: Power Automate Community."

Pretty cool.

Classify Text

You can see the pattern: we just change our main parts, like the prompt.

Create Blog

Again, the same pattern, but this one shows how far you can push the Question / Key Data to get as precise as you need.

Context text (Summarized Test): "Power Automate is an incredibly powerful tool that can help businesses automate their processes and tasks, saving time and money. With Power Automate, businesses can choose from a variety of subscription plans to best fit their needs. The per-user plan is great for individuals who want to automate cloud apps, services, and data with digital process automation (DPA). This plan allows people to create and run unlimited flows for just … per user/month. For businesses that need to automate legacy apps on a desktop, the per-user plan with attended RPA is the best option. This plan includes the same capabilities as the base plan, plus the ability to automate legacy apps on a desktop via robotic process automation (RPA) in attended mode, for … per user/month. For businesses that want the flexibility to pay only when a person runs a flow connected with premium connectors, the pay as…"

Not a bad effort with the notoriously difficult Microsoft licenses.

All the rest (Create from blank)

After using it a few times, you will find yourself using "create from blank" more often, as you can see the structure is consistent and easy to follow without templates. I use "Find" for answering questions like the above. You can also translate, count, sum, and much much more; you are only limited by the prompts you can think of. Yep, it even writes code that works too. ChatGPT democratized AI; I think the Create text with GPT connector democratizes AI for Low Code and Power Automate developers.

Further Reading: Inspired by Documentation |
2023-07-17 07:13:26 |
海外TECH |
DEV Community |
Evolution of Web: From HTML to AI |
https://dev.to/developedbyjk/evolution-of-web-from-html-to-ai-2o93
|
Evolution of Web: From HTML to AI

The internet has become an indispensable part of our lives, providing us with information on Google, entertainment with YouTube or TikTok, or just watching some memes online. Wow, the internet is fun, isn't it? But have you ever wondered how the web has transformed over the years? From its humble beginnings with HTML, the blue link, static pages, and those unattractive random-color pages, to the advent of artificial intelligence (AI), developing machines that talk and work like real humans. With AI manipulating video and creating a video of your ex saying silly shit with just her voice and image, isn't it fascinating? Or AI creating images and memes that only a few can understand, like this. Okay, I guess that is some weird meme, but good news: AI can't replace human humor. Back to topic: the Internet has come a long way, so get your swim suit with extra oxygen, because we are going to deep dive into the evolution of HTML to AI.

HTML: The Foundation of the Web. In the early 1990s, the World Wide Web emerged as a platform for sharing information. When our guy Tim Berners-Lee wanted to share some awesome stuff with his colleagues, he created HTML (HyperText Markup Language), and this played a crucial role in this revolution. HTML provided a standardized way to structure and display content on web pages using tags (<>) and elements. With HTML, static web pages became a reality, enabling users to access and consume information more efficiently. Thank you, Tim sir. Applause.

CSS: Beautifying the Web. Ahh, those static pages were becoming boring; something needed to be done to make them beautiful. While HTML laid the groundwork for web development, Cascading Style Sheets (CSS) brought a new level of aesthetic appeal to websites (thanks to danny for this homepage). Introduced in the late 1990s, CSS allowed developers to separate content from presentation, making it easier to style and format web pages. The introduction of CSS revolutionized web design, giving rise to visually stunning websites that captured the attention of users.

JavaScript: Enabling Interactivity. The next major leap in web evolution came with the introduction of JavaScript. With JS, you were able to do a lot of fun things; interactions with machines were not so smooth before JS. Developed in the mid-1990s, JavaScript empowered developers to add interactivity and dynamic elements to web pages. With JavaScript, websites could respond to user actions, perform calculations, and update content in real time. This breakthrough transformed the web from a static information repository to an interactive platform.

Web 2.0: The Rise of User Participation. After getting happy with interacting with websites, humans wanted to add their own stuff to the Internet. There came Web 2.0. The early 2000s marked the advent of Web 2.0, a paradigm shift that emphasized user participation and collaboration. Social media platforms, blogging sites, and content-sharing platforms allowed users to create and share their own content. A human creating content! This resulted in community and interconnectivity. Web 2.0 democratized the web, giving everyone a voice and transforming the internet into a platform for social interaction and content creation.

Mobile Revolution: Web Anytime, Anywhere. The proliferation of smartphones and mobile devices brought about the mobile revolution. Websites had to adapt to varying screen sizes and resolutions, leading to the development of responsive web design. Responsive design ensured that websites were accessible and user-friendly across different devices, enabling users to access the web anytime and anywhere.

Cloud Computing: Scalability and Flexibility. Cloud computing revolutionized the way websites were hosted and managed. Instead of relying on physical servers, businesses could now leverage cloud infrastructure to store and process data. Cloud computing provided scalability, flexibility, and cost efficiency, allowing businesses to focus on their core operations rather than managing complex IT infrastructure.

Big Data: Insights and Personalization. The exponential growth of data gave birth to big data analytics. Websites began leveraging this data to gain insights into user behavior and preferences. Through data analysis, businesses could personalize user experiences, deliver targeted content, and make data-driven decisions. Big data opened up new possibilities for understanding users and tailoring web experiences to their individual needs.

Machine Learning: Powering Intelligent Websites. Machine learning algorithms brought intelligence to the web. From recommendation systems to fraud detection, websites started utilizing AI algorithms to analyze data, make predictions, and automate processes. Machine learning empowered websites to provide personalized recommendations, improve search results, and enhance user engagement.

Natural Language Processing: Conversational Interfaces. Natural Language Processing (NLP) revolutionized user interactions with websites. Websites equipped with NLP capabilities could understand and respond to human language. And this all gave rise to...

Hi, I am Juned Khan. I write, design, and code on the web. I love memes & emojis (and used them in this post). Well, it took me time, but I wanted to share something new. Hope you liked it. Ohh yes, I create such informative visuals on Instagram: developedbyjk. For reading this, I wish you good luck on your web development journey. Here's a gift for reading: Free HTML Quick Guide. My more links |
2023-07-17 07:01:03 |
海外ニュース |
Japan Times latest articles |
Japan ramps up support for startups venturing overseas |
https://www.japantimes.co.jp/news/2023/07/17/business/startups-overseas-success-initiatives/
|
japan |
2023-07-17 16:41:02 |
海外ニュース |
Japan Times latest articles |
Searing temperatures across Japan trigger heatstroke concerns |
https://www.japantimes.co.jp/news/2023/07/17/national/searing-temperatures-across-japan-trigger-heatstroke-concerns/
|
nation |
2023-07-17 16:28:27 |
ニュース |
BBC News - Home |
Plan to crack down on 'rip-off' university degrees |
https://www.bbc.co.uk/news/uk-politics-66216005?at_medium=RSS&at_campaign=KARANGA
|
skilled |
2023-07-17 07:20:02 |
ニュース |
BBC News - Home |
'Inevitable' jobs will be more automated, says new AI adviser |
https://www.bbc.co.uk/news/technology-66128106?at_medium=RSS&at_campaign=KARANGA
|
hogarth |
2023-07-17 07:19:57 |