python |
New posts tagged Python - Qiita |
I wrote a Disjoint Sparse Table in Python. |
https://qiita.com/hyouchun/items/65a7948b5328c810a5f3
|
classdisjointsparset |
2023-07-23 20:11:03 |
Docker |
New posts tagged docker - Qiita |
[Memo] First Steps in Frontend Development with Vite + Vue |
https://qiita.com/jhon-manjirou/items/fe4955287723c4634b65
|
vite, vue |
2023-07-23 20:11:32 |
Tech Blog |
Developers.IO |
Resolving the "You have requested more vCPU capacity than your current vCPU limit of 0 allows for the instance bucket that the specified instance type belongs to" error when creating an EC2 instance |
https://dev.classmethod.jp/articles/jw-resolving-vcpu-capacity-errors/
|
Resolving the "You have requested more vCPU capacity than your current vCPU limit of 0 allows for the instance bucket that the specified instance type belongs to" error when creating an EC2 instance. Hello, this is Kim Jaewook from Classmethod. This time, when creating an EC2 instance, the "You have requested more vCPU capacity than your current vCPU |
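The error itself points at an account-level limit: the resolution is to request a higher EC2 vCPU service quota for the affected region. As a rough illustration of that kind of fix (not code from the linked post), the request can also be made with boto3's Service Quotas API; the region and quota code below are assumptions and should be verified for your account first:

```python
# Hypothetical sketch: ask AWS to raise the EC2 on-demand vCPU quota.
# Assumes configured AWS credentials; the region and quota code are
# assumptions (L-1216C47A is commonly the "Running On-Demand Standard
# instances" quota) and should be confirmed, e.g. via list_service_quotas.
import boto3

quotas = boto3.client("service-quotas", region_name="ap-northeast-1")

resp = quotas.request_service_quota_increase(
    ServiceCode="ec2",
    QuotaCode="L-1216C47A",  # assumed quota code for standard on-demand vCPUs
    DesiredValue=8.0,        # example target number of vCPUs
)
print(resp["RequestedQuota"]["Status"])
```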
2023-07-23 11:55:48 |
Overseas TECH |
DEV Community |
Object-Oriented Programming (OOP) in JavaScript |
https://dev.to/diwakarkashyap/object-oriented-programming-oop-in-javascript-cfm
|
Object-Oriented Programming (OOP) in JavaScript

Let's start a journey into the world of programming with a fun analogy. Imagine you're at a bustling train station. There are different types of trains, each with its unique features and functions. This is quite similar to how OOP in JavaScript works.

What's OOP?
Object-Oriented Programming (OOP) is like a train system. Just like a train is made up of different coaches, an application in OOP is composed of different objects. These objects are instances of classes, which can be considered the blueprints for creating objects.

Objects and Their Special Features
In JavaScript, objects are like special trains. They carry values in properties and can perform tasks with methods. For example, let's take a train object:

let train = {
  name: "Express",
  speed: 120,      // example values
  capacity: 400,   // example values
  start: function () { console.log(`${this.name} has started moving`); },
  stop: function () { console.log(`${this.name} has stopped`); }
};

Here, name, speed and capacity are properties of the train, and start and stop are the methods.

The Blueprint: Classes
Classes are like blueprints used to create trains. Just like each train follows a blueprint to have an engine, bogies and a certain speed, each object in JavaScript follows a class. Here's an example of a Train class in JavaScript:

class Train {
  constructor(name, speed, capacity) {
    this.name = name;
    this.speed = speed;
    this.capacity = capacity;
  }
  start() { console.log(`${this.name} has started moving`); }
  stop() { console.log(`${this.name} has stopped`); }
}

To create a new train, we can now use:

let expressTrain = new Train("Express", 120, 400); // example values

And just like a real train, our expressTrain can now start and stop:

expressTrain.start();
expressTrain.stop();

The Special Trains: Inheritance
Let's say we now want to add a special type of train, a Bullet Train, which has all the properties of a train but can also tilt. We can achieve this through inheritance. In JavaScript we use the keyword extends to inherit from another class. Here's how we can create a BulletTrain class that extends the Train class:

class BulletTrain extends Train {
  constructor(name, speed, capacity) {
    super(name, speed, capacity);
  }
  tilt() { console.log(`${this.name} is tilting`); }
}

Now we can create a BulletTrain that can do everything a normal Train can do and also tilt:

let bulletTrain = new BulletTrain("Shinkansen", 320, 1000); // example values
bulletTrain.start();
bulletTrain.stop();
bulletTrain.tilt();

And there you go! That's a simple and friendly introduction to Object-Oriented Programming in JavaScript, seen through the lens of a bustling train station. Just remember: objects are the special trains, classes are the blueprints for these trains, and inheritance lets us create new types of special trains. Choo choo! Happy coding! |
2023-07-23 11:53:57 |
Overseas TECH |
DEV Community |
How to Choose the Right MQTT Data Storage for Your Next Project |
https://dev.to/reductstore/how-to-choose-the-right-mqtt-data-storage-for-your-next-project-1031
|
How to Choose the Right MQTT Data Storage for Your Next Project

Choosing the right database can be overwhelming; trust me, I know. (Photo by Jan Antonin Kolar on Unsplash.)

Since joining ReductStore's project, I've been exploring alternative solutions to get a better understanding of how the project fits into the current ecosystem. I found all kinds of databases, from the most popular ones to the most obscure ones. To give you some context, we will look at solutions to store data from IoT devices (e.g. sensors, cameras, etc.) that commonly use MQTT to communicate with each other. MQTT stands for Message Queuing Telemetry Transport and is a lightweight messaging protocol designed to be efficient, reliable and scalable, making it ideal for collecting and transmitting data from sensors in real time. Why is this important when choosing a database? Well, MQTT is format agnostic, but it works in a specific way. We should therefore be aware of its architecture, how it works and its limitations to make the right choice. This is what this article is about: we will try to cut through the fog and explore some key factors to consider when selecting the right option. Let's get started!

- Brief explanation of MQTT and its use in IoT projects
- Importance of choosing the right data storage for MQTT
- Factors to consider when choosing MQTT data storage
- Types of database options available

Brief explanation of MQTT and its use in IoT projects
MQTT is a publish/subscribe messaging protocol that allows devices to send and receive messages over a network. It is particularly well suited for IoT projects due to its lightweight nature, low power consumption and support for unreliable networks. In an MQTT-based system, devices known as publishers publish messages to a central broker, which then distributes these messages to other devices, known as subscribers, that have subscribed to specific topics. The subscriber can then process the message and take appropriate action. For example, a sensor may publish a message containing its current temperature, which is then received by a subscriber that has subscribed to the topic "temperature".

(Example of Pub/Sub architecture, image by author.)

MQTT is easy to use and keeps subscription and publishing tasks separate. To get info from a device you don't need to know all its details, like its address or password; you just need to connect to a middleman called a broker and know the topic's name. This pattern offers many advantages for IoT over other protocols such as HTTP, which requires servers and clients to be aware of each other's details and communicate directly.

Importance of choosing the right data storage for MQTT
An important aspect to consider is the real-time aspect of IoT projects. Note that MQTT messages are not directly time-stamped, but you will often set the time information in the payload. This is done by the publisher, and you may decide to use the time you want: for example, one can transmit the time of the edge device when the message is created, or the time of the sensor when the data is collected. More importantly, the storage solution must be able to handle the high throughput of MQTT and store messages in chronological order. A time series database might seem an obvious choice to store data in chronological order, as it allows for efficient storage and retrieval of messages indexed by time.

Another aspect to consider is the type of data that can be transmitted via MQTT, which is pretty much anything. The MQTT protocol is format agnostic, meaning that it does not specify how data should be formatted. This allows for a wide variety of data types to be transmitted via MQTT, including text, images or audio. The only requirements are that the data must be formatted either as a string (UTF-8 encoded) or as a byte stream, and that the payload size does not exceed 256 MB, which is pretty large (think several seconds of 4K video). In other words, MQTT can be used to transmit all kinds of data, including text, images, video or audio, as long as it is formatted as a string or binary stream and lighter than 256 MB.

Factors to consider when choosing MQTT data storage
When selecting an MQTT data storage solution, there are several factors to consider:

- Performance
- Scalability
- Reliability
- Security
- Compatibility
- Cost

Performance: speed and efficiency in processing and retrieving data
Performance is an obvious one. The storage should be able to process and retrieve data efficiently and quickly, with a low response time, which implies that the database's read and write speeds, along with the network latency, are minimized. When storing pictures from a camera or high-frequency sensor data (e.g. accelerometers), performance becomes even more critical: cameras often generate high-resolution images that can be pretty large, and an average accelerometer can easily produce on the order of a thousand measurements per second (kHz rates). For example, let's consider a smart surveillance system that uses MQTT to transmit images captured by security cameras. In this scenario the storage solution needs to be able to handle a continuous stream of time-stamped images in real time. This requires not only fast write speeds but also efficient compression techniques to reduce the size of each image without compromising its quality. And when it comes to retrieving these pictures for analysis or review purposes, speed is crucial: you should be able to easily access and fetch images from any specified time interval, both from the database installed on the edge device and from the database deployed in the cloud.

Scalability: ability to handle large amounts of data and increasing workload
While this might be expected for cloud databases, how does it apply to edge databases? Since MQTT applications frequently produce a significant amount of data from various devices and sensors, the edge storage should be capable of managing this high throughput, having a solid quota policy for when you run out of disk, and providing replication methods to back up data in the cloud. A scalable system should also include considerations such as handling increasing workloads, accommodating additional devices or sensors, and supporting horizontal scaling by adding more edge devices or storage nodes when needed.

Reliability: ensuring data integrity and availability without loss or corruption
To guarantee the reliability and accessibility of the stored data without any loss or damage, it is crucial to select a storage solution that incorporates appropriate measures. These measures should be capable of addressing possible failures, such as power outages or network interruptions, and safeguarding against data loss or compromise. One popular solution is to replicate data. Replication involves creating duplicate copies of the data and storing them in multiple locations or servers. This redundancy ensures that even if one server fails, the data can still be accessed from another one, minimizing the risk of data loss or corruption.

Security: protection against unauthorized access, attacks and breaches
MQTT data can contain sensitive information such as device telemetry, user behavior or images, making it essential to protect against unauthorized access, attacks and breaches. How can you know if a database is trustworthy? I believe the focus should be on the open source designation and its associated community. The open nature of a database allows for greater transparency in terms of vulnerabilities and fixes, as the code is available for scrutiny by anyone. This means that the community can quickly identify and address any potential security issues. Open source databases often have a large user base and community support, which means that many eyes are looking out for potential threats or vulnerabilities, leading to quicker detection and resolution of any issues.

Compatibility: integration with other systems, protocols or analytics tools
When considering the compatibility factor, think about the other systems or protocols that your database needs to integrate with. For example, if you are also using a cloud platform like AWS or Azure, you will want to ensure that your chosen edge database can integrate with these platforms. In addition, if you plan on performing analytics on your MQTT data, you will need a solution that can easily integrate with popular tools such as Grafana, Apache Kafka or Apache Spark.

Cost: affordability and cost-effectiveness
Cost is another important factor to consider when selecting a storage solution. Overall, the cost of a database can be broken down into two categories: upfront costs and ongoing costs. Upfront costs include the initial purchase price of the database, along with any additional hardware or software required to run it. Ongoing costs include maintenance fees, support fees and any other recurring expenses associated with using the database.

Types of database options available
There are several types of popular database options available for IoT, such as time series, NoSQL or relational (SQL) databases.

Time series databases
Time series databases are specifically designed to handle time-stamped data, making them an ideal choice for storing MQTT data. The whole idea in one sentence: time series databases optimize storage and retrieval performance for time series data by efficiently indexing the data. They provide built-in functions and query capabilities that are well suited for analyzing and visualizing information stored in chronological order. They often offer features like downsampling and retention policies to manage large volumes of historical data efficiently.

Examples of popular time series databases
Some popular time series databases that are commonly used for storing MQTT data include:

- InfluxDB, an open source database that provides high-performance storage and retrieval of time-stamped data with a SQL-like query language. You can store numbers (integer or floating-point values), boolean values or text strings, with a per-value size limit on strings in v2 that is orders of magnitude smaller than MQTT's maximum payload.
- TimescaleDB, an extension of PostgreSQL that adds time series capabilities to the relational database model. It provides scalability and performance optimizations for handling large volumes of time-stamped data while maintaining the flexibility of a relational database.
- Prometheus, another widely used open source monitoring system and time series database designed for collecting metrics from various sources. It offers powerful querying capabilities, alerting functionalities and easy integration with Grafana, an open source visualization and monitoring tool.

ReductStore for time series blob data
ReductStore is a unique time series database that offers a specialized database for storing MQTT data in the form of time series blob data. Unlike traditional time series databases that store structured data, ReductStore allows for storing unstructured data (blobs) along with their associated timestamps. This makes it suitable for scenarios where MQTT data includes large textual information, images, audio, high-frequency sensor streams (e.g. accelerometers), videos or other types of binary files. With ReductStore you can efficiently store and retrieve this blob data while benefiting from the performance optimizations and query capabilities provided by a time series database.

NoSQL databases
NoSQL, which stands for "not only SQL", encompasses a broader category of databases that are not limited to traditional SQL (Structured Query Language) databases. NoSQL databases are a popular choice for storing MQTT data due to their flexibility and scalability. These databases can handle large volumes of structured, semi-structured and unstructured data, making them ideal for storing diverse information. For instance:

- Key-value stores are uncomplicated databases that store information in the format of key-value pairs, similar to a dictionary.
- Document databases organize semi-structured data as documents.
- Column-oriented databases organise data by columns, making it efficient to find specific information.
- Graph databases optimise for storing and querying highly connected data.
- Wide column stores hold vast amounts of structured and semi-structured data.

Examples of popular NoSQL databases
Some popular NoSQL databases commonly used include MongoDB, Apache Cassandra and Amazon DynamoDB.

- MongoDB is a document-oriented database with high scalability and flexibility for handling unstructured or semi-structured data. It offers rich querying capabilities, indexing options and support for distributed data storage with a technique called "sharding".
- Apache Cassandra is a highly scalable and fault-tolerant database that can handle large volumes of data across multiple nodes or clusters. It provides fast read and write operations, making it suitable for real-time analytics or applications with high throughput requirements.
- Amazon DynamoDB is a fully managed NoSQL database, meaning that it offers automatic scaling, low-latency access to data, durability guarantees and integration with other AWS services.

Blob storage
Blob stands for Binary Large Object, and blobs can be stored in services specialized in storing unstructured binary data such as files, images, videos and backups. Blob storage is not typically considered part of the NoSQL database category, as it does not provide the advanced querying capabilities or data modeling options found in traditional NoSQL databases.

Examples of popular blob storage
Some popular blob storage options that can be used for storing unstructured binary data include MinIO, Google Cloud Storage, Azure Blob Storage and Amazon S3 (Simple Storage Service).

- If you need to install blob storage on your edge device, you should consider MinIO. It is an open source, high-performance object storage system designed to store any kind of unstructured data (usually heavy files) such as photos, videos, backups and container images.
- Then there are the well-known cloud service providers. Azure Blob Storage is a scalable and highly available object storage service provided by Microsoft Azure. It offers various storage tiers so you can optimize cost and performance based on your requirements, and it also provides features like lifecycle management, versioning and data encryption.
- Google Cloud Storage is a globally distributed object storage service offered by Google Cloud Platform. It provides trustworthy and scalable storage for large amounts of blob data, with different storage classes and pricing options to optimize cost and performance.
- Amazon S3 is a widely used object storage service thanks to its global infrastructure and integration with other AWS services. It also provides features like lifecycle policies, versioning and server-side encryption.

Relational databases
Relational databases such as MySQL are another option to consider for storing MQTT data in IoT projects. These databases provide a structured and organized approach to data storage, making them suitable for projects that require strong data consistency and complex relationships between different entities. Relational databases use tables with predefined schemas to store data, allowing for efficient querying and indexing. They offer features like transactions, which means that you can perform multiple operations on the database as a single unit of work, and each transaction follows a set of properties known as ACID (Atomicity, Consistency, Isolation, Durability) to ensure that the data remains consistent even if there is a failure or a crash.

Examples of popular relational database management systems
Some popular relational database management systems (RDBMS) that are commonly used include MySQL, PostgreSQL or Oracle Database.

- MySQL is an open source RDBMS that is widely used in IoT projects due to its simplicity, reliability and scalability. It offers strong data consistency and supports efficient querying using SQL.
- PostgreSQL is another popular open source RDBMS known for its robustness, extensibility and support for advanced features like JSON data types and spatial indexing.
- Oracle Database is a commercial RDBMS with a proven track record in handling multiple databases. It offers advanced security features and analytics capabilities.

Conclusion
Choosing the right storage option for your project is crucial for ensuring efficient data management and analysis. Factors such as the type of data, scalability needs and project requirements should be aligned with your choice of database. For example, if your project involves collecting real-time sensor data at low frequencies from various devices spread across different locations, you would require a time series database like InfluxDB, TimescaleDB or Prometheus that can handle high volumes of time-stamped data. On the other hand, if your project involves gathering real-time sensor data at a high frequency or capturing time-stamped images from a camera, you would probably need a solution such as ReductStore. NoSQL databases like MongoDB, Amazon DynamoDB or Apache Cassandra are ideal for handling large volumes of unstructured or semi-structured data with high scalability and real-time processing capabilities. Blob storage options such as MinIO, Azure Blob Storage, Google Cloud Storage and Amazon S3 are good options for storing large amounts of blob data, including multimedia files like audio, images or videos. Finally, relational databases like MySQL, Oracle Database or PostgreSQL suit projects requiring strong data consistency, complex relationships between entities and advanced querying capabilities.

Thanks for reading! I hope this article will help you choose the most appropriate database for your IoT project. If you have any questions or comments, please feel free to reach out. |
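Whatever backend you pick, the ingestion side usually looks the same: an MQTT subscriber that timestamps each message (since MQTT itself does not) and hands it to the store. Here is a minimal sketch of that pattern, assuming the paho-mqtt client, a broker on localhost, and an illustrative SQLite table standing in for whichever database you choose:

```python
# Minimal ingestion sketch: subscribe to an MQTT topic and persist each
# payload with an arrival timestamp added by the subscriber.
# Assumptions: paho-mqtt 1.x (with 2.x, pass mqtt.CallbackAPIVersion.VERSION1
# to Client()), a broker at localhost:1883, and an illustrative SQLite schema.
import sqlite3
import time

import paho.mqtt.client as mqtt

db = sqlite3.connect("mqtt_messages.db")
db.execute("CREATE TABLE IF NOT EXISTS messages (ts REAL, topic TEXT, payload BLOB)")


def on_message(client, userdata, msg):
    # MQTT messages carry no timestamp of their own, so record arrival time here.
    db.execute(
        "INSERT INTO messages (ts, topic, payload) VALUES (?, ?, ?)",
        (time.time(), msg.topic, msg.payload),
    )
    db.commit()


client = mqtt.Client()
client.on_message = on_message
client.connect("localhost", 1883)
client.subscribe("sensors/#")
client.loop_forever()  # callbacks run in this thread, so one connection suffices
```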
2023-07-23 11:52:02 |
Overseas TECH |
DEV Community |
Django Caching 101: Understanding the Basics and Beyond |
https://dev.to/pragativerma18/django-caching-101-understanding-the-basics-and-beyond-49p
|
Django Caching Understanding the Basics and BeyondIn today s world of web development website speed and performance are paramount Users expect websites to load quickly and provide a seamless user experience Slow loading pages can lead to frustrated users decreased engagement and ultimately lost business opportunities To overcome thesechallenges web developers employ various optimization techniques and one of the most effective strategies is caching Caching the process of storing frequently accessed data in a temporary storage layer can significantly boost the performance of your Django application by reducing database queries network round trips and overall processing time By serving cached content instead of generating it from scratch you can drastically improve the response times of your web pages and relieve the load on your backend infrastructure This article aims to demystify caching in Django empowering developers of all levels to harness its full potential Whether you re an intermediate Django developer or an experienced practitioner looking to fine tune your applications this article will walk you through the fundamentals strategies and best practices of caching We will begin by exploring the key concepts behind caching and its various benefits Understanding how caching works and the different types of caching available in Django will provide a solid foundation for implementing effective caching solutions Next we ll dive into practical examples and demonstrate step by step approaches for integrating caching into your Django applications From simple in memory caching to advanced techniques using database or external caching systems we ll cover a range of scenarios and help you decide which approach is best suited for your specific use cases So let s get started Introduction to caching and its benefitsCaching is a technique used in computer systems to store frequently accessed or computed data in a temporary storage location known as a cache The primary purpose of caching is to improve system performance and reduce the time and resources required to fetch or generate data When a system or application needs certain data it first checks the cache If the data is found in the cache it can be retrieved quickly without the need for expensive operations such as disk reads or network requests This significantly reduces latency and improves overall system responsiveness Imagine you re a librarian working in a very busy library with countless books and eager readers Every time a reader asks for a specific book you have two options either rush to the bookshelves find the book and bring it back or take a shortcut and keep a small selection of frequently requested books at your desk This selection of books represents the cache By having these popular books readily available you can quickly satisfy the majority of reader requests without having to navigate the entire library each time The cache saves time and effort by storing frequently accessed books within arm s reach providing a speedy and efficient service Hence just as the librarian optimizes the book retrieval process caching optimizes data access resulting in faster response times reduced workload and an overall smoother experience for users It is a powerful technique that offers numerous benefits such as Improved Performance By storing frequently accessed data closer to the application or user caching reduces the time required to fetch or generate the data This leads to faster response times and a more responsive user experience Caching 
is particularly beneficial for applications that involve complex computations database queries or external API calls Reduced Load on Resources Caching helps alleviate the load on system resources such as servers databases or APIs By serving cached data instead of recalculating or fetching it repeatedly caching reduces the number of resource intensive operations required This leads to better resource utilization and improved scalability allowing systems to handle higher loads without compromising performance Lower Latency Caching significantly reduces the latency involved in fetching data from slower or remote sources such as disk drives or network servers By keeping frequently accessed data in a cache closer to the application or user the data can be retrieved with minimal delay resulting in faster response times and smoother user interactions Cost Efficiency Caching can lead to cost savings by reducing the need for expensive resources For example caching can help minimize database load allowing organizations to use lower cost database instances or reduce the number of required servers By optimizing resource utilization caching helps organizations achieve better cost effectiveness Enhanced Scalability Caching improves the scalability of systems by reducing the load on critical resources With caching systems can handle higher traffic volumes without sacrificing performance This scalability is particularly important for high traffic websites web applications or services that require real time data processing To check the difference in code performance before and after implementing caching consider the following example Before implementing caching from django shortcuts import renderfrom myapp models import Productfrom django utils import timezonedef product list request Start measuring time start time timezone now products Product objects all response render request product list html products products Calculate time taken end time timezone now elapsed time end time start time response X Elapsed Time elapsed time total seconds return responseIn the above example we ve added code to measure the time taken to process the request We capture the start time before executing the database query and rendering the template After the response is generated we calculate the elapsed time and include it in the response headers as X Elapsed Time After implementing caching from django shortcuts import renderfrom django views decorators cache import cache pagefrom myapp models import Productfrom django utils import timezone cache page Cache the response for minutesdef product list request Start measuring time start time timezone now products Product objects all response render request product list html products products Calculate time taken end time timezone now elapsed time end time start time response X Elapsed Time elapsed time total seconds return responseIn the updated example we ve applied the cache page decorator to enable caching for the product list view With the time measurement included in the response headers you can use the Django Debug Toolbar to inspect the X Elapsed Time value and compare the response time before and after implementing caching You should observe a significant reduction in response time for subsequent requests within the cache duration indicating the improved performance achieved through caching Now that we have a clear understanding of caching and its benefits let s delve into how caching works with Django Understanding the Django caching framework and its componentsThe Django 
caching framework is a built in feature of the Django web framework that provides tools and functionalities to implement caching strategies in Django applications It offers a comprehensive and flexible system for caching data at various levels including template fragment caching view caching and low level caching The Django caching framework consists of the following key components Cache BackendsDjango supports various cache backends which determine how and where the cached data is stored These backends include in memory caching file based caching database caching and external caching systems like Redis or Memcached Developers can choose the appropriate backend based on their specific requirements The CACHES setting in Django s configuration determines the cache backend to use and its configuration options Here s an example of configuring the cache backend to use the Memcache settings pyCACHES default BACKEND django core cache backends memcached MemcachedCache LOCATION Let s break down the different components of the cache configuration default This is the name of the cache backend Django supports multiple cache backends so you can define and use different cache configurations with distinct names BACKEND This specifies the cache backend to use You need to provide the fully qualified name of the cache backend class Django provides built in cache backends such as django core cache backends memcached MemcachedCache or django core cache backends filebased FileBasedCache Alternatively you can define and use custom cache backends LOCATION This indicates the location or identifier for the cache The value can vary depending on the cache backend being used For example for in memory caching you can specify a unique identifier or suffix while for filesystem caching you can provide the path to the cache directory Cache APIThe Cache API provides a simple and consistent interface for interacting with the cache backend It offers methods for storing retrieving and deleting cached data Developers can access the cache object through the cache module in Django Here are some commonly used methods cache set key value timeout Stores the value in the cache with the specified key and optional timeout cache get key Retrieves the value from the cache associated with the given key cache delete key Deletes the cached data associated with the given key Here are a few examples from django core cache import cache Setting a value in the cachecache set my key my value timeout Getting a value from the cachemy value cache get my key Deleting a value from the cachecache delete my key Template Fragment CachingDjango allows for fragment level caching within templates which is useful for caching specific parts of a template that are expensive to render This caching is achieved using the cache template tag By wrapping the dynamic content with this tag Django will cache the rendered HTML output reducing the need for repetitive computations Here s an example load cache cache my key lt Expensive or dynamic content here gt endcache In this example the content inside the cache block will be cached for seconds using the specified my key Subsequent requests within the cache timeout will retrieve the cached content instead of re rendering it View CachingDjango provides the ability to cache entire views or specific parts of views This is particularly useful when dealing with views that require heavy processing or involve database queries Developers can use decorators like cache page or cache control to cache the entire view or control 
caching based on specific criteria Here s an example of caching a view using the cache page decorator from django views decorators cache import cache page cache page Cache the view for minutesdef my view request View logic hereHere s an example of using the cache control decorator in Django from django views decorators cache import cache control cache control public True max age def my view request View logic hereIn the above example we use the cache control decorator to apply cache control directives to the HTTP response generated by the my view function The cache control decorator accepts various parameters to control caching behavior In this case we set public True to indicate that the response can be cached by public caches We also set max age to specify that the response can be considered fresh for up to seconds hour Cache MiddlewareDjango includes cache middleware that can be added to the middleware stack This middleware intercepts requests and checks if a cached version of the response exists If available it serves the cached response bypassing the entire view processing and database queries Here s an example of implementing cache middleware in Django from django middleware cache import CacheMiddlewareclass MyCacheMiddleware CacheMiddleware def process request self request Custom logic to determine if the request should be cached if self should cache request request return super process request request return None def should cache request self request Custom logic to determine if the request should be cached return request method GET and not request user is authenticated def process response self request response Custom logic to modify the response before caching response super process response request response Additional processing if needed return responseIn the above example we created a custom cache middleware by subclassing CacheMiddleware which is a built in Django middleware class responsible forhandling caching We override the process request method to implement our custom logic to determine if the request should be cached or not In this case we check if the request method is GET and the user is not authenticated You can modify thislogic according to your specific caching requirements If the request meets the conditions for caching we call thesuper process request request method to proceed with the default caching behavior provided by CacheMiddleware This will check if a cached response is available for the current request and return it if found bypassing further processing If the request does not meet the caching conditions we return None to bypass the caching process and allow the request to continue down the middleware chain Different types of caching supported by DjangoDjango supports various types of caching to improve the performance of web applications Here are different types of caching supported by Django In memory cachingDjango provides built in support for in memory caching which stores cached data in the server s memory This type of caching is suitable for storing frequently accessed data that doesn t change often such as static content configuration settings or small computed values Django s default cache backend django core cache backends locmem LocMemCache uses in memory caching settings pyCACHES default BACKEND django core cache backends locmem LocMemCache LOCATION unique suffix Here are some pros and cons of using in memory caching Pros Fast access and retrieval of cached data since it is stored in memory Well suited for caching small to medium sized 
datasets Easy to configure and doesn t require additional dependencies Cons Limited storage capacity since it relies on available memory Data is not persistent and will be lost upon server restart Filesystem cachingDjango also supports caching data on the filesystem Cached data is stored as files in a specified directory on the server s filesystem Filesystem caching is useful when you want to persist cached data even after restarting the server and have a moderate amount of data to cache It can be effective for caching static files or relatively static database queries The django core cache backends filebased FileBasedCache backend is used for filesystem caching settings pyCACHES default BACKEND django core cache backends filebased FileBasedCache LOCATION path to cache directory Here are some pros and cons of using filesystem caching Pros Data is persistent and survives server restarts Suitable for caching larger datasets compared to in memory caching Cons Slower access and retrieval of cached data compared to in memory caching Filesystem operations can introduce latency especially with a large number ofcache files Database cachingDjango allows caching data in a database table This type of caching is suitable for applications where you want to leverage the database for storing and retrieving cached data It is beneficial when you need to cache dynamic data that is frequently accessed and updated It is suitable for scenarios where multiple application instances share the same cache making it a good choice for distributed environments The django core cache backends db DatabaseCache backend is used for database caching settings pyCACHES default BACKEND django core cache backends db DatabaseCache LOCATION my cache table Here are some pros and cons of using database caching Pros Data is persistent and can be shared across multiple instances of theapplication Can handle larger datasets compared to in memory and filesystem caching Cons Slower access and retrieval of cached data compared to in memory andfilesystem caching Can introduce additional database load Cache backend with MemcachedDjango supports using Memcached as a cache backend Memcached is a high performance distributed memory caching system that can be used to store cached data across multiple servers It is recommended when you need a high performance caching solution that can handle large datasets and scale horizontally It is well suited for caching frequently accessed data and can be beneficial in environments with heavy read traffic The django core cache backends memcached MemcachedCache backend is used for Memcached caching settings pyCACHES default BACKEND django core cache backends memcached MemcachedCache LOCATION Here are some pros and cons of using Memcache Pros Highly scalable and efficient caching solution Distributed caching allows for caching data across multiple servers Optimized for high read performance Cons Requires a separate Memcached server setup and configuration Data is not persistent and can be lost if the Memcached server restarts Cache backend with RedisDjango also supports using Redis as a cache backend Redis is an in memory data structure store that can function as a cache server It offers advanced cachingfeatures and can be used to store various types of data It is suitable for scenarios that require advanced caching capabilities such as caching session data real time data or caching across multiple applications or services It is a good choice when you need a highly flexible and feature rich caching solution 
The django redis cache RedisCache backend is used for Redis caching settings pyCACHES default BACKEND django redis cache RedisCache LOCATION redis OPTIONS CLIENT CLASS django redis client DefaultClient Here are some pros and cons of using Redis caching Pros Versatile caching solution with support for various data types and advancedcaching features Persistence of data even after Redis server restarts Distributed caching capability for scalability Cons Requires a separate Redis server setup and configuration Slightly slower than in memory caching due to network overhead Custom cache backendsDjango allows developers to create custom cache backends tailored to specific caching requirements By implementing a custom cache backend developers can integrate Django with other caching systems or implement unique caching strategies For implementing custom cache backends you can create a custom cache backend class by inherting BaseCache and implementing the required cache methods add get etc Here s an example myapp cache backends pyfrom django core cache backends base import BaseCacheclass MyCustomCacheBackend BaseCache def init self location params super init params Custom initialization logic def add self key value timeout None version None Custom cache logic pass def get self key default None version None Custom cache logic pass Override other cache methods as needed settings pyCACHES default BACKEND myapp cache backends MyCustomCacheBackend LOCATION custom cache location Here are some pros and cons of using a custom cache backend Pros Flexibility to implement custom caching logic tailored to specificrequirements Can integrate with external caching systems or implement unique cachingstrategies Cons Requires additional development effort to implement and maintain May not benefit from the optimizations and community support available withbuilt in cache backends Now that we have discussed the different types of cache backends let s dive a bit more into cache key generation Understanding cache keys and how to generate themCache keys are unique identifiers that determine the storage and retrieval of cached data They play a crucial role in the caching process as they enable efficient lookup and retrieval of cached content Understanding how to generate cache keys correctly is essential for effective caching in Django In Django cache keys can be generated by combining various factors relevant to the data being cached Here are some common considerations for cache key generation Identify unique factors Start by identifying the factors that make the cached data unique and distinguish it from other data These factors can include parameters variables or identifiers that affect the data being cached For example in a view that retrieves user specific data the user s ID or username would be a unique factor Avoid collisions Ensure that the cache keys you generate do not collide with each other Collisions occur when different data shares the same cache key leading to incorrect results To prevent collisions include all relevant factors that uniquely identify the data in the cache key String concatenation One common approach is to concatenate the unique factors to generate the cache key You can use string concatenation or interpolation to combine the factors into a single string It s essential to ensure consistent ordering and formatting of the factors to generate the same cache key for the same data consistently Hashing If the factors for cache key generation are complex or contain sensitive information you can use a 
hashing function to generate a unique hash based cache key The hash function should produce a consistent hash value for the same input ensuring that the same data generates the same cache key consistently Normalize input Normalize any inputs that contribute to the cache key For example convert strings to lowercase remove leading trailing whitespaces or format numbers consistently Normalizing the input helps to prevent different variations of the same data from generating different cache keys Versioning If you anticipate making changes to the structure of the cached data or the cache key generation logic consider incorporating versioning into the cache key By including a version number in the cache key you can easily invalidate the cache when the structure or generation logic changes ensuring that the updated data is retrieved Custom cache key generation In some cases you may need to implement custom logic for cache key generation Django provides the make template fragment key function that allows you to generate cache keys based on template fragments This can be useful when caching fragments of a template that depend on specific factors Here s an example of cache key generation in Django from django core cache import cachedef get user data user id cache key f user data user id cached data cache get cache key if cached data is None Data not found in cache retrieve from the database data fetch data from database user id Store data in cache with the generated cache key cache set cache key data return data return cached dataBy incorporating a version number or timestamp into your cache keys you can easily invalidate the cache by updating the version Whenever you want to invalidate the cache simply update the version number and the new cache key will be different from the previous one def get user data user id cache key f user data user id cache get user data version user data cache get cache key if user data is None Retrieve user data from the source e g database user data get user data from source user id Cache the user data with the updated cache key cache set cache key user data return user datadef update user data user id Update the user data in the source e g database update user data in source user id Update the cache key version cache set user data version cache get user data version Hence by carefully generating cache keys taking into account the unique factors avoiding collisions and incorporating normalization and versioning when necessary you can ensure accurate and efficient caching in your Django applications Common caching patterns in DjangoWhen implementing caching in Django there are several common caching patterns that can be used based on the specific requirements of your application Here are three common caching patterns cache per view cache per user and cache per site Cache per ViewThis pattern involves caching the entire rendered output of a specific view It is useful when the content of a view doesn t change frequently and can be served directly from the cache Django provides a built in decorator cache page that can be applied to a view function or class based view to enable caching for that specific view For example from django views decorators cache import cache page cache page Cache for minutesdef my view request View logicIn the above example the cache page decorator is used to cache the my view function for a duration of minutes Subsequent requests within that timeframe will be served directly from the cache bypassing the view execution Cache per UserThis pattern 
involves caching data specific to each user It can be useful when you have user specific content that remains relatively static or can be reused across multiple requests The cache keys can be generated based on unique identifiers like the user s ID or username For example from django core cache import cachedef get user data user id cache key f user data user id cached data cache get cache key if cached data is None Data not found in cache retrieve from the database data fetch data from database user id Store data in cache with the generated cache key cache set cache key data return data return cached dataIn the above example the get user data function fetches user specific data from the cache based on the user id If the data is not found in the cache it is fetched from the database and stored in the cache with the generated cachekey Cache per SiteThis pattern involves caching data that is shared across the entire site or application regardless of the user It can be useful for static content configuration settings or frequently accessed data that is common acrossmultiple requests You can cache such data using a cache key that represents the site or application level For example from django core cache import cachedef get site settings cache key site settings cached settings cache get cache key if cached settings is None Settings not found in cache retrieve from the database or configuration settings fetch settings from database Store settings in cache with the generated cache key cache set cache key settings return settings return cached settingsIn the above example the get site settings function retrieves site settings from the cache using the cache key site settings If the settings are not found in the cache they are fetched from the database or configuration andstored in the cache These caching patterns can be combined or adapted based on your specific application s needs By utilizing caching effectively you can significantly improve the performance and scalability of your Django application However simply implementing caching is not enough it s essential to continuously monitor and optimize cache performance to reap its full benefits So let s dive into the world of cache monitoring and optimization and unlock the full potential of caching in your Django application Monitoring and optimizing cache performanceMonitoring and optimizing cache performance is crucial for ensuring the efficient utilization of caching mechanisms and maximizing the performance of your application Here are some tools and techniques you can use for cache monitoring analysis and optimization Cache Backend Specific Monitoring ToolsMany cache backends such as Memcached and Redis provide their own monitoring tools These tools allow you to monitor cache metrics track cache usage and analyze cache performance specific to the chosen cache backend Some popular tools include Memcached libMemcached MemCachierRedis Redis CLI RedisInsight Redis CommanderThese tools provide insights into cache statistics memory usage hit rate miss rate and other relevant metrics helping you monitor and optimize cache performance Application Performance Monitoring APM ToolsAPM tools like Better Stack New Relic Datadog or AppDynamics provide comprehensive monitoring and profiling capabilities for your application including cache performance analysis These tools can track cache hits misses response times and other performance related metrics They also offer features like distributed tracing which can help identify cache related issues in complex 
application architectures Django Debug ToolbarThe Django Debug Toolbar is a powerful tool for monitoring and analyzing various aspects of your Django application including cache usage It provides a panel that displays cache related information such as cache hits misses and cache keys used during a request response cycle By installing and configuring the Debug Toolbar you can gain insights into cache performance on a per request basis aiding in cache optimization Custom Logging and InstrumentationIncorporate logging statements and custom instrumentation in your code to track cache usage cache hits cache misses and cache related operations By logging cache related events you can analyze the behavior of your cache implementation identify performance bottlenecks and fine tune cache strategies accordingly You can use Python s built in logging module or third party logging solutions for this purpose Load Testing and ProfilingLoad testing tools like Apache JMeter or Locust can help simulate high traffic scenarios and measure cache performance under heavy load By load testing your application with different cache configurations and analyzing the results you can identify cache related performance issues such as cache contention or cache expiration problems Profiling tools like cProfile or Django s built in profiling middleware can help identify cache related performance bottlenecks within specific code segments Cache Tuning and ConfigurationReview and optimize cache configuration parameters such as cache size eviction policies and expiration times Adjust these settings based on the characteristics of your application data access patterns and memory constraints Experiment with different cache configurations and monitor theimpact on cache performance to find the optimal setup for your application Regular Performance Monitoring and BenchmarkingImplement regular performance monitoring and benchmarking to track cache performance over time Continuously monitor cache hit rates cache eviction rates and response times to identify any degradation or improvements in cache performance Use benchmarking tools to compare different cache configurations and evaluate the impact on overall application performance These techniques will help you identify and resolve cache related issues leading to improved application speed scalability and user experience Caching best practicesImproper caching implementation can lead to unexpected issues and degrade the overall user experience To ensure effective caching and avoid common pitfalls here are some best practices tips and tricks Identify the Right Cacheable ContentNot all data is suitable for caching Identify the parts of your application that can benefit from caching such as static content database query results or expensive computations Caching irrelevant or frequently changing data canresult in stale or incorrect responses For example in an e commerce application you can cache the product catalog pages or frequently accessed product details By caching these parts you reduce database queries and speed up page rendering for subsequent requests Use Granular CachingRather than caching entire pages consider caching smaller components or fragments of your content This allows more flexibility and reduces cache invalidation needs Utilize template fragment caching or HTTP caching headers to cache specific parts of your views For example in a news website instead of caching entire article pages you can cache individual components such as the headline body or related articles This 
allows you to update specific sections of the page without invalidating the entire cache Set Appropriate Cache ExpirationDetermine the optimal expiration time for your cached data It should be long enough to benefit from caching but short enough to avoid serving outdated content Consider the volatility of the data and set expiration times accordingly For example consider a weather application that fetches weather data from an API Since weather conditions can change frequently it s important to set a shorter expiration time for the weather data cache such as minutes Thisensures users receive up to date weather information Implement Cache InvalidationEstablish mechanisms to invalidate the cache when the underlying data changes This ensures that users always receive up to date information Use cache keys cache versioning or signals to trigger cache invalidation when relevant data is updated For example in a social media application when a user posts a new comment on a post you can invalidate the cache for that particular post s comments section This ensures that subsequent requests display the updated comments without relying on the cached version Consider Varying Cache KeysIf your application serves different content based on user specific factors e g user roles permissions or personalized settings consider incorporating those factors into the cache key This allows caching personalized content without mixing user specific data For example you have an e learning platform where users have different course enrollment statuses To cache personalized course progress you can include the user s ID or enrollment status in the cache key ensuring each user receives their respective cached data Use Cache Control HeadersLeverage the Cache Control HTTP headers to control caching behavior Set appropriate values for max age must revalidate or no cache directives to define caching rules at the client side This ensures consistent cache behavior across different user agents Monitor and Analyze Cache UsageRegularly monitor cache hit rates miss rates and cache size to evaluate the effectiveness of your caching strategy Analyze cache statistics to identify areas for improvement such as increasing cache hit rates or reducing cache misses Consider Caching MechanismsDjango provides various caching backends including in memory caches e g Memcached and persistent caches e g Redis Choose the caching mechanism based on your application s requirements scalability needs and available infrastructure Test and BenchmarkPerform load testing and benchmarking to evaluate cache performance under different scenarios Identify potential bottlenecks assess cache efficiency and make necessary adjustments to cache configurations based on the results Regularly Review and OptimizeCache performance can change over time as your application evolves Regularly review and optimize your caching strategy based on user behavior changing data access patterns and performance monitoring ConclusionIn this comprehensive guide we have explored the intricacies of caching in Django including its various components types best practices and optimization techniques By understanding and implementing caching effectively you can significantly enhance your Django application resulting in improved response times reduced server load and enhanced user experiences Mastering the art of caching empowers you to unlock the full potential of your applications By following caching best practices leveraging appropriate caching mechanisms and continuously monitoring and 
Conclusion
In this comprehensive guide we have explored the intricacies of caching in Django, including its various components, types, best practices, and optimization techniques. By understanding and implementing caching effectively, you can significantly enhance your Django application, resulting in improved response times, reduced server load, and better user experiences. Mastering the art of caching empowers you to unlock the full potential of your applications. By following caching best practices, leveraging appropriate caching mechanisms, and continuously monitoring and optimizing cache performance, you can ensure that your applications are highly performant and scalable. It is important to note that caching is not a one-time setup but an ongoing process that requires regular review, testing, and optimization. As application requirements evolve, it is crucial to stay vigilant, adapt caching strategies accordingly, and consistently optimize cache configurations to achieve optimal performance in Django projects. By embracing caching as an integral part of the development process and keeping up with the evolving needs of your application, you can harness the power of caching to deliver exceptional performance and provide users with seamless and responsive experiences. |
2023-07-23 11:25:44 |
海外TECH |
DEV Community |
AWS VPC: Private Subnets Increase Risk |
https://dev.to/wparad/aws-vpc-private-subnets-increase-risk-4147
|
AWS VPC: Private Subnets Increase Risk

Should I put my service in the private subnet or the public one? This age-old question has been around since almost the beginning of AWS. It seems like a simple question with a simple answer. However, the real truth is that there is no answer.

Private Subnets
Before we get to it, let's review what a private subnet even is. AWS allows you to create virtual networks; these networks are called VPCs. In each VPC you can isolate compute resources. The logical isolation inside a VPC is known as a subnet. Subnets can connect to the internet, and the internet can be connected to them. Okay, so where does the private part come in? There is actually no such thing as a private subnet in AWS (or a public one, for that matter). It's a completely made-up term that doesn't exist in AWS, because there is only a subnet. However, the term is frequently associated with the following criteria: the subnet has public IP addresses, the internet can route traffic to those IP addresses, and the IP addresses can route traffic out to the internet. If you are missing any of those, then technically it isn't a public subnet. Does that make it a private one? Who knows.

The Public Subnet Risk
Now that we know what a private subnet is, let's talk about the risk of using a public one for your product, service, application, or technology. What risks are associated with using a public subnet? The problem with this question is that risk is not binary. It could be concluded that there is more of a risk that something bad will happen than if it is in a private subnet. Is that true, though? The VPC and subnets each offer their own security appliances to protect resources, and the only difference is internet connectivity. Which means the question becomes: is a subnet connected to the internet a risk? Maybe. See, just because there is a route to your subnet's compute resources doesn't mean that you've opened the firewall of the instance or the Security Group to allow connections to the machine. But you could. The question isn't "Is my instance or database directly connected to the internet a risk?", because that has a simple answer: YES.

Having your subnet connected to the internet doesn't offer attackers any additional means to get to your compute resources. And further, often in the case of using a private subnet you've added a NAT and internet gateway connecting through your public subnet, so in essence you've reconnected them to the internet, albeit in a different way. Actually, the truth here is that if your technology is sufficiently protected, it doesn't matter what setup you use. What does sufficient protection look like? The problem with this conclusion is that we can't list out every possible attack vector, because there are an infinite number of them. And further, our technology isn't deployed in isolation. We deploy new software packages, application versions, and fixes, and we may also change the infrastructure of the VPC over time. That means we can't just evaluate the RISK at a static point in time. So the risks are actually whatever you conceive to be a risk at a certain point in time. No one can tell you what your risks are. They can tell you what their risks are and then point you to how they solved them, but even then it won't include everything. Assume you get a list of risks. We know that isn't a full list, because the list of risks is infinite. And based on that list you go with a public subnet. One day, will you regret it when something happens anyway? Maybe: "If only I had gone with a private subnet." What happens if you learn about a new risk you aren't willing to deal with? Will you switch back to a private subnet before something happens?

Further, you need to understand that just because there is a risk doesn't mean it should be considered. It is risky to go outside because you might get hit by a car. Cars are dangerous. But that could be a risk you ignore. Is it a risk? Yes. When is it a risk? When you are outside. Do you care? No, at least not in this case. So what do we do? Well, we know that we can't have a full list of the risks, and we know that we might not even care about the risks on the list, but the most dangerous thing is leaving the conversation thinking you have all the risks because someone on the internet gave you a list. You might be missing the most critical aspect that wasn't on the list. That means you need to develop the list yourself: what are your risks? This is known as threat modeling. You come up with actual risks that you care about, and then you evaluate whether the solution you have is the right one.

Threat Modeling
It may seem like it isn't important to do this, but I'll explain why it is. Let's look at an example. We put our instances in the private subnet, but then we realize that we need to add an ALB to grant access from the internet to your service, AND we also realize that we need to grant access from your service to the internet. The reasons are usually something like: well, this is a product, users need to access it, and it needs to get updates or make calls to other services to perform its functions. As part of making these changes, likely in Terraform, you update the security groups attached to the instance. And in doing so, you accidentally remove access to the database. This accident can easily happen, since database access is protected by a security group and configured in Terraform, and is therefore something you can break by changing infrastructure. This is an example of "using a private subnet increases risk". A common issue is one of the form "increased complications increase risk". I'm going to repeat that again: Complications = Risk. The more complicated your systems are, the more likely you will be to accidentally do something wrong. At the same time, the fewer safeguards you have in place, the more likely you are to accidentally do something wrong. So in reality we don't know if our infrastructure choices have increased or decreased risk. Each change can increase or decrease the risk, and we won't know which way it goes.

The Risks
That's because risk is a spectrum: how risky something is can increase or decrease with changes to the circumstances. We don't have perfect knowledge of the circumstances, nor perfect knowledge of all the risks. The reason is that there are an infinite number of risks, and an infinite subset of those risks change when you alter aspects of the situation. When switching from a public subnet to a private subnet, some of the risks decrease in riskiness and others actually increase, as we have seen. That means that a private subnet is both more secure and less secure at the same time. And security isn't the only kind of risk we can talk about. So the only valuable conversation to have is about which risks you care about, and then to decide whether those risks are higher or lower with a private subnet.

Conclusion
There was recently a great talk about our collective inability to Measure Security Outcomes. I bring up this topic as it is one that frequently occurs and seems not to have any concrete and satisfiable answer yet. It's also incredibly relevant for my teams in building Authress, which focuses on providing user identity and access control. We care about making Authress as secure as possible to protect our customers' users, so you can imagine questions about security are always present in our conversations. And if you are interested in this topic and others, you can join our Security Community. |
2023-07-23 11:23:52 |
海外TECH |
DEV Community |
Introducing StackUp - A Single Application to Spin Up Your Entire Dev Stack |
https://dev.to/patinthehat/introducing-stackup-a-single-application-to-spin-up-your-entire-dev-stack-1mel
|
Introducing StackUp: A Single Application to Spin Up Your Entire Dev Stack

As developers, we often find ourselves working with complex development environments that require a series of steps to set up and tear down. This process can be time-consuming and prone to errors, especially when working in a team where consistency is key. Enter StackUp, a tool that automates the process of spinning up and shutting down your entire development stack.

What is StackUp?
StackUp is a tool designed to simplify the process of setting up and tearing down complex development environments. It allows you to define a series of steps that execute in order on startup and shutdown, as well as a list of server processes that should be started. Additionally, StackUp runs an event loop while the server processes are running, allowing you to run tasks on a cron schedule. One of the key features of StackUp is its ability to automate routine tasks. With a simple configuration, you can define a sequence of tasks that your project requires, such as starting containers, running database migrations, or seeding data. This automation not only saves you time but also ensures consistency across your development environment. StackUp also includes a robust precondition system. Before doing anything, checks can be performed to ensure everything is set up correctly. This feature helps prevent common issues that occur when the environment is not properly configured.

Running StackUp
To run StackUp, simply run the binary in a directory containing a stackup.yaml configuration file:

    stackup

Or specify a configuration filename:

    stackup --config stackup.dev.yaml

Configuration
The application is configured using a YAML file named stackup.yaml, containing sections for preconditions, tasks, startup, shutdown, servers, and the scheduler.

Preconditions
The preconditions section of the configuration file is used to specify a list of conditions that must be met before the tasks and servers can run. Each precondition is defined by a name and a check. The name is a human-readable description of the precondition, and the check is a javascript expression that returns a boolean value indicating whether the precondition is met.

Tasks
The tasks section of the configuration file is used to specify all tasks that can be run during startup, shutdown, as a server, or as a scheduled task.

Startup & Shutdown
The startup and shutdown sections of the configuration define the tasks that should be run synchronously during either startup or shutdown. The values listed must match a defined task id.

Servers
The servers section of the configuration file is used to specify a list of tasks that the application should start as server processes. The values listed must match a defined task id.

Scheduler
The scheduler section of the configuration file is used to specify a list of tasks that the application should run on a schedule. Each entry should contain a task id and a cron expression. The task value must be equal to the id of a task that has been defined.

Example Configurations
StackUp provides several example configurations to get you started. Let's take a look at an example configuration for a Laravel project:

    name: my stack
    description: laravel application stack
    version:

    preconditions:
      - name: dependencies are installed
        check: binaryExists("php")
      - name: project is a laravel application
        check: exists(getCwd() + "/artisan")

    startup:
      - task: start-containers
      - task: run-migrations-fresh
      - task: run-migrations-no-seed

    shutdown:
      - task: stop-containers

    servers:
      - task: horizon-queue
      - task: httpd

    scheduler:
      - task: artisan-scheduler
        cron:

    tasks:
      - name: spin up containers
        id: start-containers
        if: exists(getCwd() + "/docker-compose.yml")
        command: docker-compose up -d
        silent: true

      - name: run migrations (rebuild db)
        id: run-migrations-fresh
        if: hasFlag("seed")
        command: php artisan migrate:fresh --seed

      - name: run migrations (no seeding)
        id: run-migrations-no-seed
        if: hasFlag("seed")
        command: php artisan migrate

      - name: stop containers
        id: stop-containers
        if: exists(getCwd() + "/docker-compose.yml")
        command: docker-compose down
        silent: true

      - name: run artisan scheduler
        id: artisan-scheduler
        command: php artisan schedule:run

      - name: horizon queue
        id: horizon-queue
        command: php artisan horizon
        platforms: [linux, darwin]

      - name: httpd
        id: httpd
        command: php artisan serve

Available Functions
Many of the configuration fields can be defined using a javascript expression syntax. To specify an expression to be evaluated, wrap the content in double braces, for example:

    {{ env("HOME") }}

Conclusion
StackUp is a powerful tool that can significantly simplify the process of setting up and tearing down complex development environments. By automating routine tasks and ensuring consistency across your development environment, StackUp can save you time and help you avoid common setup errors. Give it a try on your next project. For more information, please see the StackUp GitHub repository. |
2023-07-23 11:05:17 |
Apple |
AppleInsider - Frontpage News |
Crime blotter: Police officer indicted for 2021 Apple Store punch |
https://appleinsider.com/articles/23/07/23/crime-blotter-police-officer-indicted-for-2021-apple-store-punch?utm_medium=rss
|
Crime blotter: Police officer indicted for 2021 Apple Store punch

In the latest Apple Crime Blotter, alleged Apple Store thieves are accused of speaking Pig Latin, a sentence is set in Minneapolis thefts, and an iCloud investigation leads to an officer's arrest.

Apple Store on Broadway in New York.

The latest in an occasional AppleInsider series looking at the world of Apple-related crime. Read more |
2023-07-23 11:39:09 |
海外ニュース |
Japan Times latest articles |
China proposes high-level talks with Tokyo, Seoul, with eye to summit |
https://www.japantimes.co.jp/news/2023/07/23/national/top-chinese-diplomat-proposes-trilateral-talks-japan-south-korea/
|
summitany |
2023-07-23 20:38:48 |
ニュース |
BBC News - Home |
Rhodes fires: Jet2 and TUI flights cancelled as British tourists in limbo |
https://www.bbc.co.uk/news/uk-66282387?at_medium=RSS&at_campaign=KARANGA
|
hotels |
2023-07-23 11:10:30 |
ニュース |
BBC News - Home |
Israel judicial reform: Netanyahu in hospital ahead of key vote |
https://www.bbc.co.uk/news/world-middle-east-66281968?at_medium=RSS&at_campaign=KARANGA
|
legal |
2023-07-23 11:52:15 |
ニュース |
BBC News - Home |
Treasury to meet bank bosses over Farage row |
https://www.bbc.co.uk/news/business-66274359?at_medium=RSS&at_campaign=KARANGA
|
political |
2023-07-23 11:32:13 |