Posted: 2023-07-23 02:14:53 RSS feed digest for 2023-07-23 02:00 (17 items)

Category / Site / Article title or trend word / Link URL / Keywords, summary, search volume / Date registered
python New posts tagged Python - Qiita SwiftUI × FastAPI environment setup https://qiita.com/zaki_barrow/items/278e7ca07f6beaf2c6eb fastapi 2023-07-23 01:34:37
python New posts tagged Python - Qiita Solving ABC311 in Python (Problems A–E) https://qiita.com/hyouchun/items/375e18daf24a4cd91f85 atcoder 2023-07-23 01:29:43
Overseas TECH Ars Technica Here’s the trailer for the live-action One Piece we’ve been waiting for https://arstechnica.com/?p=1955937 anime 2023-07-22 16:31:03
Overseas TECH MakeUseOf The Top 7 Location-Based Reminder Apps You Should Try https://www.makeuseof.com/top-location-based-reminder-apps/ reminder 2023-07-22 16:45:24
Overseas TECH MakeUseOf ChatGPT App Not Working on Your iPhone? 9 Fixes to Try https://www.makeuseof.com/chatgpt-app-not-working-on-iphone-fixes/ chatgpt 2023-07-22 16:30:22
Overseas TECH MakeUseOf How to Import Photos From Your Camera Onto a Windows 10 PC https://www.makeuseof.com/import-photos-from-camera-windows-10/ windows 2023-07-22 16:16:23
Overseas TECH MakeUseOf Innocn 27M2V: An Affordable Mini-LED Gaming Monitor That Is Anything but Cheap https://www.makeuseof.com/innocn-27m2v-review/ The Innocn 27M2V is an affordable mini-LED gaming monitor that is anything but cheap, featuring 4K resolution at a high variable refresh rate, HDR, and low input lag; despite the plastic build, the value is unmatched for such high-spec features. 2023-07-22 16:06:23
Overseas TECH DEV Community PHP - Create your own Data Validator in PHP: Step-by-Step https://dev.to/fadymr/php-create-your-own-data-validator-in-php-step-by-step-3pbi A tutorial on building a custom data-validation library in PHP from scratch, so that user-submitted data can be checked for validity and security. Step 1 creates a Validation class (namespace DevCoder\Validator) that stores an array of validators per field, raises an InvalidArgumentException if any of them does not implement ValidatorInterface, runs them against the submitted data, and exposes the collected errors via getErrors() and the validated input via getData(). Step 2 creates the rule classes: a ValidatorInterface with validate($value): bool and getError(): ?string, an AbstractValidator that builds error messages by substituting context values into a message template, and concrete rules such as Integer (a ctype_digit check plus configurable min/max limits and messages) and NotNull. Step 3 instantiates the Validation class with an array whose keys are field names (age, number_of_children, salary) and whose values are arrays of validators. Step 4 validates $_POST: when validate() returns true, the data from getData() is saved and the user is redirected; otherwise the form is re-rendered with getErrors(). The article closes with further rules that can be added (Email, StringLength, Alphabetic, Choice, Url, Numeric, and a Custom rule taking a callback) and points to the author's GitHub repository; the library is pitched as simple, easy, and ideal for small projects. 2023-07-22 16:15:48
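As a quick illustration of the field-validator pattern the entry above describes (the original article is in PHP), here is a minimal Python sketch; the class and method names below are invented for the example and are not the DevCoder\Validator API.

```python
# Minimal sketch of the per-field validator pattern described above, in Python.
# Names (Validation, NotNull, Integer) mirror the article's PHP classes only loosely.

class NotNull:
    def validate(self, value):
        self.error = None
        if value is None:
            self.error = "This value should not be null"
            return False
        return True

class Integer:
    def __init__(self, min=None, max=None):
        self.min, self.max = min, max

    def validate(self, value):
        self.error = None
        if value is None:
            return True  # null-ness is NotNull's job
        if not str(value).isdigit():
            self.error = "This value should be of type integer"
            return False
        value = int(value)
        if self.min is not None and value < self.min:
            self.error = f"{value} should be {self.min} or more"
            return False
        if self.max is not None and value > self.max:
            self.error = f"{value} should be {self.max} or less"
            return False
        return True

class Validation:
    def __init__(self, field_validators):
        self.validators = field_validators  # {field_name: [validator, ...]}
        self.errors = {}

    def validate(self, data):
        self.errors = {}
        for field, validators in self.validators.items():
            for validator in validators:
                if not validator.validate(data.get(field)):
                    self.errors[field] = validator.error
        return not self.errors

# Usage:
validation = Validation({"age": [NotNull(), Integer(min=1, max=120)]})
print(validation.validate({"age": "17"}))                 # True
print(validation.validate({"age": None}), validation.errors)
```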
Overseas TECH DEV Community Automate Python Linting and Code Style Enforcement with Ruff and GitHub Actions https://dev.to/ken_mwaura1/automate-python-linting-and-code-style-enforcement-with-ruff-and-github-actions-2kk1 Maintaining a consistent code style is important, but manually fixing linting issues is tedious and easy to forget, so this follow-up to a previous Ruff tutorial sets up GitHub Actions to lint Python code with Ruff on every push and commit any fixes automatically. The prerequisites are a GitHub repository with Python code, GitHub Actions enabled, and Ruff installed locally. The first workflow checks out the code, sets up Python, installs Ruff, runs it with the fix option, and then uses stefanzweifel/git-auto-commit-action to commit whatever changed with a "style fixes by ruff" message. Ruff itself is configured through a ruff.toml or .ruff.toml file that sets the line length, the target Python version, and the selected rule sets (for example E and W). Alternatively, the chartboost/ruff-action GitHub Action can run Ruff inside the workflow: it is simpler and needs less code, while running Ruff directly gives more control (for example a custom configuration file passed via the config argument) and is marginally faster because Ruff does not have to be installed on every run. The auto-commit step is optional and its commit message and other options are configurable; combined with Ruff it applies lint fixes automatically on every push, and the Actions output and repository history show the files changed and committed. Suggested next steps include running Ruff on pull requests too, configuring commit options like the commit user and branch, and setting up Slack or email notifications. The post links to the author's FastAPI Backend Template repository (Dockerized FastAPI with asynchronous PostgreSQL via SQLAlchemy and Alembic) as an example of the workflow in use, and closes with references to the Ruff documentation, GitHub Actions, git-auto-commit-action, and ruff-action. 2023-07-22 16:15:29
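For a local counterpart to the CI step the entry above describes, the sketch below shells out to Ruff's ruff check --fix command from Python; it assumes Ruff is installed (pip install ruff) and is an illustration rather than the article's workflow file.

```python
# Minimal local sketch mirroring the workflow's "ruff check --fix" step.
# Assumes the ruff executable is on PATH (pip install ruff).
import subprocess
import sys

def lint_and_fix(path: str = ".") -> int:
    # --fix applies safe autofixes; the exit code is non-zero if issues remain.
    result = subprocess.run(
        ["ruff", "check", path, "--fix"], capture_output=True, text=True
    )
    print(result.stdout or "No remaining lint issues.")
    if result.returncode != 0:
        print(result.stderr, file=sys.stderr)
    return result.returncode

if __name__ == "__main__":
    sys.exit(lint_and_fix())
```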
Overseas TECH DEV Community Language Models For Dummies #2 - Popular Language Models 🤖 https://dev.to/layanyashoda/language-models-for-dummies-2-popular-language-models-6cp The post first explains what a parameter is: in machine learning and neural networks, parameters are the values a model learns from data during training, above all the weights on the connections between neurons, which determine how information flows through the network. They are learned by optimizing an objective function, typically via backpropagation, with gradients used to iteratively adjust the values until predictions match the training targets; once training is complete, the optimized parameters let the model handle new, unseen inputs. The parameter count indicates the size and complexity of a model but not its quality: larger counts generally capture more nuanced patterns and improve performance, at the cost of more computational resources for training and inference.

It then surveys popular language models. GPT (OpenAI) is a multi-layer transformer pre-trained on massive amounts of internet text that excels at generating coherent, contextually relevant text; the original model has about 117 million parameters, with the larger GPT-2 and GPT-3 at roughly 1.5 billion and 175 billion. BERT (Google) learns bidirectional word representations and is fine-tuned for tasks such as sentiment analysis, question answering, and text classification, with roughly 110 million parameters in the base model and about 340 million in BERT-Large. XLNet (Google/CMU) builds on BERT's bidirectionality with a permutation-based training approach and performs strongly on tasks like coreference resolution, document ranking, and machine translation; its base model is comparable in size to BERT-Base, with larger versions reaching hundreds of millions of parameters. Transformer-XL (Google/CMU) adds recurrence mechanisms such as relative positional encodings and a segment-level memory to capture long-range dependencies, ranging from tens of millions to hundreds of millions of parameters. T5 (Google) takes a text-to-text approach in which every task is converted into a text-to-text format, from about 220 million parameters for T5-Base up to 11 billion for the largest version. RoBERTa (Meta AI) is an optimized BERT pre-training recipe using larger batches, more data, and longer training, typically between about 125 million and 355 million parameters. ALBERT (Google) applies parameter-reduction techniques to stay memory- and compute-efficient while keeping comparable performance, from around 12 million parameters for ALBERT-Base to roughly 235 million for ALBERT-xxlarge. The counts are approximate, depend on the specific versions and configurations used, and newer models may have been released since the article was written. 2023-07-22 16:14:29
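To make the "parameter count" idea in the entry above concrete, here is a small Python sketch that counts the learnable weights of a toy transformer-style encoder; the dimensions are assumptions picked for the example, not published model sizes.

```python
# Illustration of what "parameter count" means: count the learnable weights of
# a toy transformer-style encoder. The dimensions below are assumptions chosen
# for the example, not the official configuration of any released model.

def transformer_block_params(d_model: int, d_ff: int) -> int:
    """Rough parameter count for one encoder block (weights plus biases)."""
    attention = 4 * (d_model * d_model + d_model)   # Q, K, V and output projections
    feed_forward = (d_model * d_ff + d_ff) + (d_ff * d_model + d_model)
    layer_norms = 2 * (2 * d_model)                 # two LayerNorms, scale + bias each
    return attention + feed_forward + layer_norms

def model_params(n_layers: int, d_model: int, d_ff: int, vocab: int) -> int:
    embeddings = vocab * d_model
    return embeddings + n_layers * transformer_block_params(d_model, d_ff)

# A BERT-Base-like configuration (12 layers, d_model=768, d_ff=3072, ~30k vocab)
# lands in the same ballpark as the ~110 million figure quoted above.
print(f"{model_params(12, 768, 3072, 30522):,}")
```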
Overseas TECH DEV Community Data Migration Strategies in Ruby on Rails: The Right Way to Manage Missing Data https://dev.to/vladhilko/data-migration-strategies-in-ruby-on-rails-the-right-way-to-manage-missing-data-3dbe The article reviews strategies for migrating, generating, and backfilling data in a Rails application, covering the typical cases (backfilling column data, moving column data between tables, generating new records, fixing corrupted or invalid values, and removing unused data) and comparing three approaches: direct data manipulation, rake tasks, and the data_migrate gem.

Direct manipulation via rails console or a direct database connection is easy, fast, and needs no new code, but it is risky, raises access and security concerns, gets no tests or code review, and leaves no record of who ran the migration or why.

The rake task approach is built up around an example: an Animal model (id, kind, status, created_at, updated_at) whose status should change from nil to reserved for all animals created before today. Starting from a simple task under lib/tasks, the author adds a series of improvements: print before/after counts so the operator can see that the task actually ran; wrap the update in an ActiveRecord transaction so an error mid-migration leaves the data consistent; prefer a single update_all statement over per-record updates (or at least process records in find_each batches), since a naive loop issues one SQL UPDATE per row, with ActiveRecord::Base.logger = Logger.new(STDOUT) available for inspecting the generated SQL; wrap each task in its own class that includes Rake::DSL, because methods defined at the top level of one .rake file leak into other tasks and can silently override each other, with potentially destructive results; and cover the task with RSpec tests by calling Rails.application.load_tasks in a support file and asserting on the affected records and the console output.

The data_migrate gem instead generates data migrations (rails g data_migration ...) that run via rake data:migrate or rake db:migrate:with_data and are tracked in a data_migrations table, just as schema migrations are tracked in schema_migrations. Such migrations run only once, should usually leave the down method empty rather than raising (so schema rollbacks are not blocked), and are best written idempotently; the same concerns about output, transactions, and efficient queries still apply. In short, a rake task fits when you want to choose when and where (staging or production) to run the migration, or to run it several times; the gem fits when the change must happen automatically, exactly once, in every environment, and in a fixed order relative to schema migrations, for example copying values into a renamed column, adding a NOT NULL constraint, and then dropping the old column. For work unrelated to schema changes, the rake task remains the more flexible and testable choice. 2023-07-22 16:13:09
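The query-optimization point in the entry above (one bulk UPDATE instead of a per-record loop) translates directly to other stacks; the sketch below shows it with Python's standard-library sqlite3 module, using an animals table invented for the demo in place of the article's Rails model.

```python
# Sketch of the article's key optimization in Python/sqlite3 rather than Rails.
# The animals table and its columns mirror the article's example but are created
# here purely for the demo.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE animals (id INTEGER PRIMARY KEY, status TEXT, created_at TEXT)")
conn.executemany(
    "INSERT INTO animals (status, created_at) VALUES (?, ?)",
    [(None, "2023-07-01"), (None, "2023-07-10"), ("adopted", "2023-07-15")],
)

# Anti-pattern (not shown running): SELECT the matching ids, then issue one
# UPDATE per row, which is what looping over model instances and calling
# update() on each produces.

# Preferred: a single bulk UPDATE inside a transaction, the sqlite3 analogue of
# Animal.where(status: nil).where("created_at < ?", Time.zone.today).update_all(...).
with conn:  # the connection context manager commits or rolls back the transaction
    conn.execute(
        "UPDATE animals SET status = 'reserved' "
        "WHERE status IS NULL AND created_at < ?",
        ("2023-07-22",),
    )

print(conn.execute("SELECT id, status FROM animals ORDER BY id").fetchall())
# -> [(1, 'reserved'), (2, 'reserved'), (3, 'adopted')]
```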
Overseas TECH DEV Community TCP/IP Network Model💻 : Deep Dive with example https://dev.to/tanishtt/tcpip-network-model-deep-dive-with-example-3j3o The TCP/IP model (Transmission Control Protocol / Internet Protocol) is the conceptual framework behind the modern internet and consists of four layers. The Application Layer provides network services directly to user applications such as web browsers, email clients, and file-transfer programs through protocols like HTTP, SMTP, and FTP, and is where user data is generated and consumed. The Transport Layer handles end-to-end communication: TCP provides reliable, connection-oriented delivery with ordering, error detection, and flow control, while UDP is a simpler connectionless protocol used where speed matters more than reliability, such as real-time streaming or online gaming; application data is split into TCP segments or UDP datagrams. The Internet Layer addresses, routes, and forwards packets between networks using IP, which assigns each device an IP address and wraps the segments into IP packets carrying source and destination addresses. The Link Layer (network access layer) handles physical transmission within a local network segment (Ethernet, Wi-Fi, PPP), encapsulating IP packets in frames addressed by the MAC address of the receiving device.

The article then traces what happens when a desktop connected through Airtel opens a browser and loads www.facebook.com: the browser issues an HTTP request at the Application Layer; TCP establishes a connection to the web server and splits the request into segments at the Transport Layer; the Internet Layer resolves www.facebook.com to an IP address via DNS and builds IP packets with source and destination addresses; the Link Layer frames the packets with the MAC address of the Airtel router and sends them over the home network; routers in the Airtel network and the internet backbone forward the packets hop by hop to the data center hosting Facebook; the server processes the request and sends its response back along the same path in reverse; and at the desktop the frames, packets, and segments are decapsulated layer by layer until the browser interprets the HTTP response and displays the page. 2023-07-22 16:06:27
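As a hands-on companion to the walkthrough above, the short Python sketch below performs a plain HTTP GET over a TCP socket; example.com is used instead of facebook.com because the latter requires TLS, and DNS resolution (the Internet Layer step) happens inside create_connection.

```python
# Minimal illustration of the Application and Transport layers described above:
# hand an HTTP request (Application Layer data) to a TCP socket and read the reply.
import socket

with socket.create_connection(("example.com", 80), timeout=5) as sock:
    request = (
        "GET / HTTP/1.1\r\n"
        "Host: example.com\r\n"
        "Connection: close\r\n"
        "\r\n"
    )
    sock.sendall(request.encode("ascii"))    # Application Layer data handed to TCP

    response = b""
    while chunk := sock.recv(4096):           # TCP delivers the byte stream back
        response += chunk

print(response.split(b"\r\n", 1)[0].decode())  # e.g. "HTTP/1.1 200 OK"
```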
Apple AppleInsider - Frontpage News Beats Studio Pro versus AirPods Max -- compared https://appleinsider.com/inside/airpods-max/vs/beats-studio-pro-versus-airpods-max----compared?utm_medium=rss Apple has released the Beats Studio Pro headphones with Active Noise Cancellation and other features; here's how they compare to the AirPods Max, which hasn't been updated since its 2020 debut. Taking inspiration from the earlier Beats Studio headphones, the Beats Studio Pro, launched by Apple this July, ships with leather ear cushions and metal sliders. Inside each earcup, a specially designed 40 mm driver aims to deliver exceptional audio quality with minimal distortion, even at high volume levels. 2023-07-22 16:32:53
Overseas TECH Engadget ExpressVPN review: Our favorite for gaming and streaming https://www.engadget.com/vpn-review-expressvpn-2023-gaming-streaming-160052492.html?src=rss ExpressVPN has become a household name, or at least as close to one as a VPN is likely to get, taking over mainstream advertisements on sites like YouTube. On Engadget's June roundup of the nine top providers it came out tops for streaming services, frequent travel, and gaming, though notably not best overall, falling short in areas like security and user-friendliness. It was easy to sign up for, download, and use, and the subscription includes a password manager for storing and autofilling credentials, though competitors such as ProtonVPN offered easier sign-in across platforms. In testing it masked the IP address and passed DNS and WebRTC leak tests, and measured speeds oddly outperformed the baseline connection, likely because the service circumvents ISP traffic shaping or a similar anomaly. Geo-blocked content (Canadian Netflix, used to watch Shrek from a US home office) streamed with little to no buffering beyond a few seconds of initial loading; gaming on the browser game Slither.io through a UK-based server showed no lag; and a ping test, which measures how long data takes to travel from the computer to the server and back, came back comparable with the VPN on or off. Support for up to five simultaneous devices, a wide range of server locations, and clear configuration instructions for routers and consoles make it the reviewer's top choice for streaming, gaming, and frequent travel. Security is more mixed: the company is based in the British Virgin Islands, outside the Five/Nine/Fourteen Eyes intelligence-sharing agreements, but it is owned by Kape Technologies, which also owns competitor CyberGhost and has a problematic history that includes spreading malware, and in 2021 the Department of Justice charged ExpressVPN CIO Daniel Gericke over cyberspying activities on behalf of the UAE, with the company standing by him in a blog post. ExpressVPN did publicly share security audits of its mobile apps, protocol, and desktop apps last year, exceeded industry standards for protections against unauthorized access, runs a vulnerability disclosure program, and says it will not pursue legal action against security researchers, but a Consumer Reports study found it did not support multifactor authentication, missed brute-force mitigation checks, and retained some data even after an account was terminated. The verdict: a strong pick for gamers, frequent travelers, and heavy streaming users who put a lot of strain on their ISPs, with better options available for the security-minded or the budget-conscious. 2023-07-22 16:00:52
News BBC News - Home Thousands flee homes and hotels on Rhodes as fires spread https://www.bbc.co.uk/news/world-europe-66279520?at_medium=RSS&at_campaign=KARANGA chief 2023-07-22 16:39:12
News BBC News - Home Tour de France 2023: Tadej Pogacar salvages pride by winning stage 20 https://www.bbc.co.uk/sport/cycling/66278990?at_medium=RSS&at_campaign=KARANGA france 2023-07-22 16:13:49
News BBC News - Home The Ashes: England's Joe Root removes Australia's Marnus Labuschagne on 111 https://www.bbc.co.uk/sport/av/cricket/66279032?at_medium=RSS&at_campaign=KARANGA England's Joe Root removes Australia batter Marnus Labuschagne on 111 on day four of the fourth Ashes Test at Old Trafford 2023-07-22 16:01:28
