Posted 2023-06-07 01:34:25 | RSS feed roundup as of 2023-06-07 01:00 (38 items)

Category | Site | Article title / trending word | Link URL | Frequent words & summary / search volume | Date added
AWS AWS Big Data Blog Set up advanced rules to validate quality of multiple datasets with AWS Glue Data Quality https://aws.amazon.com/blogs/big-data/set-up-advanced-rules-to-validate-quality-of-multiple-datasets-with-aws-glue-data-quality/ Data is the lifeblood of modern businesses. In today's data-driven world, companies rely on data to make informed decisions, gain a competitive edge, and provide exceptional customer experiences. However, not all data is created equal. Poor-quality data can lead to incorrect insights, bad decisions, and lost opportunities. AWS Glue Data Quality measures and monitors the … 2023-06-06 15:59:21
AWS AWS Big Data Blog Getting started with AWS Glue Data Quality from the AWS Glue Data Catalog https://aws.amazon.com/blogs/big-data/getting-started-with-aws-glue-data-quality-from-the-aws-glue-data-catalog/ AWS Glue is a serverless data integration service that makes it simple to discover, prepare, and combine data for analytics, machine learning (ML), and application development. You can use AWS Glue to create, run, and monitor data integration and ETL (extract, transform, and load) pipelines and catalog your assets across multiple data stores. Hundreds of … 2023-06-06 15:57:43
AWS AWS Big Data Blog Deep dive on Amazon MSK tiered storage https://aws.amazon.com/blogs/big-data/deep-dive-on-amazon-msk-tiered-storage/ In the first post of the series, we described some core concepts of Apache Kafka cluster sizing and the best practices for optimizing the performance and cost of your Kafka workload. This post explains how the underlying infrastructure affects Kafka performance when you use Amazon Managed Streaming for Apache Kafka (Amazon MSK) tiered storage. We … 2023-06-06 15:55:22
AWS AWS Big Data Blog How SumUp made digital analytics more accessible using AWS Glue https://aws.amazon.com/blogs/big-data/how-sumup-made-digital-analytics-more-accessible-using-aws-glue/ This is a guest blog post by Mira Daniels and Sean Whitfield from SumUp. SumUp is a leading global financial technology company driven by the purpose of leveling the playing field for small businesses, serving as the financial partner for millions of small merchants in markets worldwide, helping … 2023-06-06 15:50:55
AWS AWS Media Blog How Media.Monks avoided carbon emissions using AWS for remote production https://aws.amazon.com/blogs/media/how-media-monks-avoided-carbon-emissions-using-aws-for-remote-production/ This blog is authored by Lewis Smithingham, SVP of Innovation, Media.Monks, and Wes Hovanec, Virtual Studios Lead, Media.Monks. Sustainability featured more prominently than ever at this year's National Association of Broadcasters Show. NAB has been described as “the world's largest annual convention for broadcasters and the broader media, entertainment and technology industries.” Responding to growing … 2023-06-06 15:54:04
AWS New posts tagged “lambda” - Qiita A sample Lambda payload, version 2.0 https://qiita.com/jobscale/items/f135a6b00df0caf4e8c9 versionroutekeypostmyendp 2023-06-07 00:48:26
python New posts tagged “Python” - Qiita ABC 304 notes https://qiita.com/mae-commits/items/befa30d69e7f7e85a4b5 various circumstances 2023-06-07 00:43:32
js New posts tagged “JavaScript” - Qiita Connecting a LINE Bot to obniz to run the bath sounds like a pleasantly refreshing setup https://qiita.com/hiroaki0724/items/7579b987dbeafc2bd28f linebot 2023-06-07 01:00:04
Overseas TECH DEV Community Top Tools for CyberSecurity in 2023 https://dev.to/scofieldidehen/top-tools-for-cybersecurity-in-2023-13og Cybersecurity has become one of the most in-demand skills, with a growing influence in the coming years. Many wonder what tools will be used and which will go extinct as more advanced tools are developed. This article will explore the tools that cyber hunters like myself use daily as a penetration-testing and cybersecurity specialist.

Datashare and Pinpoint: Datashare and Pinpoint are essential tools for eDiscovery, which involves identifying, collecting, validating, and analyzing digital evidence. Simply put, it's about working with electronic materials related to investigations. These tools allow sharing of case materials online with external users, providing end-to-end analytics and search capabilities. They are indispensable for conducting investigations effectively.

Archivarius and DtSearch: Archivarius and DtSearch are designed to handle large amounts of textual information. They can read files and extract relevant information such as nicknames, email addresses, phone numbers, and hyperlinks. These tools enable searching through vast amounts of data using advanced search operators. They are also useful for searching partial data, e.g., searching by email addresses or partially obscured phone numbers. Additionally, they can be used to create non-relational databases quickly.

Venator: Venator is a versatile OSINT (Open Source Intelligence) browser based on Librewolf, a privacy- and security-focused fork of Firefox. This browser is specifically tailored for use in Eastern Europe, which sets it apart from similar tools like Oryon and CSI. It provides additional panels for quick access to search engines and specialized OSINT resources, categorized by research topics such as telephone numbers, email addresses, nicknames, and websites.

Spreadsheets: Don't underestimate the power of Google Sheets. It is a versatile tool that allows you to prototype almost any data-collection service using open sources. By leveraging the Google ecosystem, including search operators known as dorks, Google Sheets becomes a powerful OSINT tool. I have used it to develop a media-monitoring system for specific queries, an identification system for Telegram channels, and checks on email addresses and user nicknames.

Breadcrumbs and Shard: Breadcrumbs is an analytical platform for exploring the blockchains of different cryptocurrencies. It facilitates investigations, monitoring, tracking, and sharing of up-to-date information about blockchain transactions. It also helps identify individual crypto wallets. A Russian alternative called Shard was introduced last year, offering similar functionality at no additional cost.

Start: Start is a popular bookmark manager among OSINT experts. While its primary purpose is to organize useful resources, it can also function as a dashboard for network monitoring, collect RSS feeds, and even serve as an effective honeypot (a trap for cyber investigations) by placing tracking pixels on its pages.

Maltego and SpiderFoot: Maltego and SpiderFoot are comprehensive software systems for OSINT that incorporate other services and databases through APIs. These tools enable the connection of custom services and data. Despite their outdated interfaces, lacking features like file forensics, timelines, and cartographic data presentation, they allow users to build and share investigation graphs. Depending on the modules integrated into these programs, they can support various investigations.

Dork Search, Advangle and DorkGenius: Dork Search is a tool for automating and suggesting advanced search operators, also known as Google dorks. It saves time by eliminating the need to manually hunt for dorks through trial and error. I later discovered an alternative service called Advangle, which proved equally effective. Additionally, I recommend trying DorkGenius, which employs AI to generate advanced search queries for Google, Bing, and DuckDuckGo. I currently use all three products in my work.

CanaryTokens and IP Logger: CanaryTokens and IP Logger are popular loggers that provide information about internet users' connections and devices. These services commonly create basic honeypots such as hyperlinks, images, emails, documents, invisible pixels, or even fake credit cards. Over the past year, both services have significantly enhanced their functionality. They now allow for the collection of comprehensive digital fingerprints of users, which greatly enhances their effectiveness in crime investigations and active OSINT activities.

Universal Search and Yandex Audience: Universal Search is a powerful tool that automates various OSINT methods. It simplifies and streamlines the process so effectively that even newcomers using it in their work can appear as experienced professionals. Yandex Audience, on the other hand, is a promising domestic tool for ADINT (Advertising Identifier Intelligence). ADINT enables the creation of social graphs and the tracking of individuals using their email addresses, phone numbers, MAC addresses, and iOS and Android device identifiers.

Conclusion: These tools have become indispensable in my daily work as a crime investigator. They allow me to handle digital evidence efficiently, search through large amounts of information, explore blockchain transactions, conduct OSINT investigations, and automate various tasks. With the constant advancements in technology and the development of new tools, the field of crime investigation continues to evolve, and these tools play a crucial role in staying ahead of the game. If you find this post thrilling, discover more posts like this on the Learnhub Blog; we write about a lot of tech-related topics, from cloud computing to frontend dev, cybersecurity, AI, and blockchain. Take a look at How to Build Offline Web Applications. 2023-06-06 15:48:09
Overseas TECH DEV Community Ensuring Business Continuity: A Guide to Choosing the Right Disaster Recovery Strategy on AWS https://dev.to/brandondamue/ensuring-business-continuity-a-guide-to-choosing-the-right-disaster-recovery-strategy-on-aws-1eoi In today's digital landscape, ensuring the availability and resilience of our systems and applications is of utmost importance. Unforeseen events such as natural disasters, hardware failures, or human errors can hurt business continuity and lead to significant downtime, data loss, and financial loss. That's where having a robust disaster recovery strategy becomes crucial.

In this article, we will explore the various considerations and options available for implementing a disaster recovery plan for your solutions built on AWS. We'll delve into the key factors to consider, different recovery strategies, and how AWS provides a comprehensive suite of tools and services to help you design a resilient and fault-tolerant architecture. Whether you're running critical business applications, managing sensitive customer data, or hosting mission-critical services, choosing the right disaster recovery strategy can safeguard your business continuity and minimize the impact of disruption.

When implementing a disaster recovery strategy on AWS, there are two key factors you have to consider. Weighing these two factors allows you to align your disaster recovery strategy with your business requirements and define the necessary processes, technologies, and AWS services to achieve the desired recovery objectives. These two factors are Recovery Time Objective and Recovery Point Objective. Let's see what each of them means in brief detail.

Recovery Time Objective (RTO) refers to the maximum tolerable downtime for your applications and services during a disaster. RTO defines the maximum amount of time your applications and services can remain unavailable before it starts impacting your business operations negatively. Think of it as a deadline or a target you set for yourself: an RTO measured in hours means you aim to recover your systems and have them up and running within that time frame. The shorter the RTO, the faster you can bounce back from disruption.

Recovery Point Objective (RPO) refers to the maximum acceptable data loss that a system can incur in the event of a disaster. If something unexpected happens, the RPO tells you the furthest point you can go back to in terms of saved work. For example, if your RPO is set to one hour, it means you can recover your project to a version saved within the past hour. Deciding on the right RPO for your business involves considering factors like how often your data changes, how important it is, and how much you're willing to invest to minimize data loss. Some businesses, like banks or healthcare providers, need very low RPOs to ensure minimal data loss.

Now, onto the various disaster recovery strategies that are available on AWS. There are several options to consider based on your specific requirements. Here are some of the common strategies.

Backup and restore: This is a suitable approach for mitigating data loss or corruption. It involves regularly creating backups of your data and applications and storing them in a separate location. This strategy ensures that you have copies of your critical data and systems that can be used for recovery in the event of a disaster or data loss.
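As a concrete illustration of this backup-and-restore pattern, using the S3 services described next, here is a minimal boto3 sketch; the bucket name, file paths, and retention periods are illustrative assumptions, not values from the article:

```python
import boto3

s3 = boto3.client("s3")
BUCKET = "example-backup-bucket"  # assumed bucket name, for illustration only

# Upload a nightly database dump to the backup bucket.
s3.upload_file("/var/backups/db-dump.sql.gz", BUCKET, "backups/db-dump.sql.gz")

# Lifecycle rule: transition backups to Glacier after 30 days and
# expire them after 365 days (retention values are assumptions).
s3.put_bucket_lifecycle_configuration(
    Bucket=BUCKET,
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-and-expire-backups",
                "Status": "Enabled",
                "Filter": {"Prefix": "backups/"},
                "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
                "Expiration": {"Days": 365},
            }
        ]
    },
)
```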
To implement this strategy on AWS, you can leverage services like Amazon S3 and S3 Glacier. S3 provides highly durable and scalable object storage where you can store your backups. It offers versioning and lifecycle-management features that allow you to automate the backup process and retain backups for the required duration. (The original post illustrates a sample backup-and-restore architecture with a diagram at this point.)

Pilot Light: With the pilot light strategy, you have a scaled-down version of your infrastructure running in the cloud. This setup includes only the essential components needed for recovery, such as critical applications, databases, and data storage. The rest of the infrastructure remains idle until a disaster occurs. One real-world scenario where the pilot light strategy is commonly used is in e-commerce businesses. Let's consider an online retail store that experiences high traffic during holiday seasons or special promotions. To ensure uninterrupted service, the retailer can maintain a pilot light environment in AWS. This includes a minimal set of web servers, application servers, and a synchronized database. During normal operations, the pilot light environment runs at a fraction of the full-scale infrastructure, resulting in lower costs. However, when disaster strikes, the retailer can rapidly scale up the infrastructure by launching additional instances and redirecting traffic to the AWS environment. This allows the retailer to handle the increased load and maintain a seamless customer experience in case of on-premises failures. Pilot light provides a balance between cost efficiency and high availability, ensuring that critical systems can be quickly restored in the event of a disaster.

Warm Standby: The warm standby approach involves maintaining a partially provisioned environment that is ready to take over in case of a disaster. It is a step above the pilot light strategy and provides a faster recovery time compared to starting from scratch. In a warm standby setup, a subset of infrastructure components is pre-provisioned and running, including virtual machines, databases, and storage. These resources are kept up to date and synchronized with the production environment, but they are not actively serving traffic. They act as a warm backup, ready to be activated when needed. When a disaster occurs, the warm standby environment can be quickly scaled up by launching additional instances and activating the necessary services. Traffic can be rerouted to the standby environment, allowing it to handle the workload and ensure business continuity. A real-world scenario where the warm standby strategy is commonly used is in financial institutions that require continuous availability of critical systems. For example, a bank's online banking platform may have a warm standby environment in AWS. The standby environment would include replicated databases, pre-configured virtual machines, and data storage. In the event of a disaster, such as a data centre outage or hardware failure, the warm standby environment can be activated, ensuring that customers can continue to access their accounts and perform transactions effectively.

Multi-Site (Hot Site): This involves maintaining a fully operational and synchronized replica of an application or infrastructure across multiple AWS regions. It is designed to provide high availability and resilience in the event of a disaster that affects the primary site. In a multi-site disaster recovery setup, the application's resources, including servers, databases, storage, and networking, are replicated and deployed across multiple geographically dispersed locations. These locations can be different AWS regions or different AZs within a region.
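Failover between the primary and secondary sites in such a setup is commonly driven by DNS health checks. Here is a minimal boto3 sketch of Route 53 failover records; the hosted-zone ID, domain name, IP addresses, and health-check ID are illustrative assumptions, not details from the article:

```python
import boto3

route53 = boto3.client("route53")

ZONE_ID = "Z0000000EXAMPLE"   # assumed hosted-zone ID
DOMAIN = "app.example.com"    # assumed record name

# PRIMARY record: traffic normally flows here; Route 53 fails over to the
# SECONDARY record when the attached health check reports unhealthy.
route53.change_resource_record_sets(
    HostedZoneId=ZONE_ID,
    ChangeBatch={
        "Changes": [
            {
                "Action": "UPSERT",
                "ResourceRecordSet": {
                    "Name": DOMAIN,
                    "Type": "A",
                    "SetIdentifier": "primary-site",
                    "Failover": "PRIMARY",
                    "TTL": 60,
                    "ResourceRecords": [{"Value": "198.51.100.10"}],
                    "HealthCheckId": "11111111-2222-3333-4444-555555555555",
                },
            },
            {
                "Action": "UPSERT",
                "ResourceRecordSet": {
                    "Name": DOMAIN,
                    "Type": "A",
                    "SetIdentifier": "secondary-site",
                    "Failover": "SECONDARY",
                    "TTL": 60,
                    "ResourceRecords": [{"Value": "203.0.113.10"}],
                },
            },
        ]
    },
)
```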
The primary site handles the normal production workload, while the secondary site serves as a standby environment, ready to take over in case of a disaster. This disaster recovery strategy offers several benefits, including reduced RTO and RPO, improved application availability, and geographical redundancy. It allows for failover to the secondary site in the event of a disaster, ensuring that business operations can continue with minimal disruption.

Final Thoughts: Choosing the right disaster recovery strategy for your solutions built on AWS is a crucial decision that can significantly impact your business's resilience and continuity. By carefully considering factors such as Recovery Time Objective, Recovery Point Objective, cost, complexity, and the specific needs of your applications and data, you can make an informed choice that aligns with your business goals. Whether you opt for backup and restore, pilot light, warm standby, or multi-site, remember that no single strategy fits all scenarios. Regular testing, monitoring, and refinement of your disaster recovery plans are essential to ensure their effectiveness. With AWS's robust infrastructure and a well-designed disaster recovery strategy, you can safeguard your business against unforeseen events and minimize disruptions, enabling you to recover quickly and continue serving your customers efficiently. 2023-06-06 15:41:56
Overseas TECH DEV Community Unlocking the Power of API Pagination: Best Practices and Strategies https://dev.to/pragativerma18/unlocking-the-power-of-api-pagination-best-practices-and-strategies-4b49 In the modern application development and data integration world, APIs (Application Programming Interfaces) serve as the backbone for connecting various systems and enabling seamless data exchange. When working with APIs that return large datasets, efficient data retrieval becomes crucial for optimal performance and a smooth user experience. This is where API pagination comes into play.

In this article, we will discuss the best practices for implementing API pagination, ensuring that developers can handle large datasets effectively and deliver data in a manageable and efficient manner. But before we jump into the best practices, let's go over what API pagination is and the standard pagination techniques used in the present day.

Note: This article caters to developers with prior knowledge of APIs and experience in building or consuming them. While the best practices and concepts discussed are applicable across different programming languages, we will primarily use Python for illustrative examples throughout the article.

Understanding API Pagination. API pagination refers to a technique used in API design and development to retrieve large data sets in a structured and manageable manner. When an API endpoint returns a large amount of data, pagination allows the data to be divided into smaller, more manageable chunks or pages. Each page contains a limited number of records or entries. The API consumer, or client, can then request subsequent pages to retrieve additional data until the entire dataset has been retrieved. Pagination typically involves the use of parameters, such as offset and limit or cursor-based tokens, to control the size and position of the data subset to be retrieved. These parameters determine the starting point and the number of records to include on each page.

By implementing API pagination, developers as well as consumers gain the following advantages:

- Improved Performance: Retrieving and processing smaller chunks of data reduces the response time and improves the overall efficiency of API calls. It minimizes the load on servers, network bandwidth, and client-side applications.
- Reduced Resource Usage: Since pagination retrieves data in smaller subsets, it reduces the amount of memory, processing power, and bandwidth required on both the server and the client side. This efficient resource utilization can lead to cost savings and improved scalability.
- Enhanced User Experience: Paginated APIs provide a better user experience by delivering data in manageable portions. Users can navigate through the data incrementally, accessing specific pages or requesting more data as needed. This approach enables smoother interactions, faster rendering of results, and easier navigation through large datasets.
- Efficient Data Transfer: With pagination, only the necessary data is transferred over the network, reducing the amount of data transferred and improving network efficiency.
- Scalability and Flexibility: Pagination allows APIs to handle large datasets without overwhelming system resources. It provides a scalable solution for working with ever-growing data volumes and enables efficient data retrieval across different use cases and devices.
- Error Handling: With pagination, error handling becomes more manageable. If an error occurs during data retrieval, only the affected page needs to be reloaded or processed rather than the entire dataset. This helps isolate and address errors more effectively, ensuring smoother error recovery and system stability.
Some common examples of paginated APIs are as follows:

- Platforms like Twitter, Facebook, and Instagram often employ paginated APIs to retrieve posts, comments, or user profiles.
- Online marketplaces such as Amazon, eBay, and Etsy utilize paginated APIs to retrieve product listings, search results, or user reviews.
- Banking or payment service providers often provide paginated APIs for retrieving transaction history, account statements, or customer data.
- Job search platforms like Indeed or LinkedIn Jobs offer paginated APIs for retrieving job listings based on various criteria such as location, industry, or keywords.

Common API Pagination Techniques. There are several common API pagination techniques that developers employ to implement efficient data retrieval. Here are a few commonly used techniques.

Offset and Limit Pagination: This technique involves using two parameters: offset and limit. The offset parameter determines the starting position in the dataset, while the limit parameter specifies the maximum number of records to include on each page. For example, an API request could include offset and limit parameters to retrieve the first batch of records (parameter values here are illustrative):

```
GET /api/posts?offset=0&limit=10
```

Cursor-Based Pagination: Instead of relying on numeric offsets, cursor-based pagination uses a unique identifier or token to mark the position in the dataset. The API consumer includes the cursor value in subsequent requests to fetch the next page of data. This approach ensures stability when new data is added or existing data is modified. The cursor can be based on various criteria, such as a timestamp, a primary key, or an encoded representation of the record. For example:

```
GET /api/posts?cursor=eyJpZCIMX
```

In the above API request, the cursor value eyJpZCIMX represents the identifier of the last fetched record; the request retrieves the next page of posts after that specific cursor. A short sketch of how such a cursor can be produced and consumed follows.
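To make the cursor mechanics concrete, here is a minimal, framework-free sketch; the base64-of-JSON encoding and the field names are assumptions (one common convention, not necessarily the article's):

```python
import base64
import json
from typing import Optional

posts = [{"id": i, "title": f"Post {i}"} for i in range(1, 101)]  # dummy data

def encode_cursor(last_id: int) -> str:
    # Opaque token: e.g. {"id": 10} becomes a base64 string the client echoes back.
    return base64.urlsafe_b64encode(json.dumps({"id": last_id}).encode()).decode()

def decode_cursor(cursor: str) -> int:
    return json.loads(base64.urlsafe_b64decode(cursor.encode()))["id"]

def get_page(cursor: Optional[str] = None, limit: int = 10):
    last_id = decode_cursor(cursor) if cursor else 0
    # Records are ordered by id, so "after the cursor" is a simple filter.
    page = [p for p in posts if p["id"] > last_id][:limit]
    next_cursor = encode_cursor(page[-1]["id"]) if page else None
    return {"data": page, "next_cursor": next_cursor}

first = get_page()                       # posts 1-10
second = get_page(first["next_cursor"])  # posts 11-20, stable even if new posts are appended
```

Because the cursor pins the position to a record id rather than an offset, inserting new records never shifts or duplicates results between requests.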
Page-Based Pagination: Page-based pagination involves using a page parameter to specify the desired page number. The API consumer requests a specific page of data, and the API responds with the corresponding page, typically along with metadata such as the total number of pages or total record count. This technique simplifies navigation and is often combined with other parameters, like limit, to determine the number of records per page. For example (the per-page count is illustrative):

```
GET /api/posts?page=2&limit=20
```

In this API request, we are requesting the second page, where each page contains 20 posts.

Time-Based Pagination: In scenarios where data has a temporal aspect, time-based pagination can be useful. It involves using time-related parameters, such as start_time and end_time, to specify a time range for retrieving data. This technique enables fetching data in chronological or reverse-chronological order, allowing for efficient retrieval of recent or historical data. For example (timestamps are illustrative):

```
GET /api/events?start_time=2023-01-01T00:00:00Z&end_time=2023-01-31T23:59:59Z
```

Here, the request fetches events that occurred between the two January dates given, based on their timestamp.

Keyset Pagination: Keyset pagination relies on sorting and using a unique attribute or key in the dataset to determine the starting point for retrieving the next page. For example, if the data is sorted by a timestamp or an identifier, the API consumer includes the last-seen timestamp or identifier as a parameter to fetch the next set of records. This technique ensures efficient retrieval of subsequent pages without duplication or missing records. To further simplify this, consider an API request:

```
GET /api/products?last_key=XYZ
```

Here, XYZ represents the last-seen key or identifier. The request retrieves the next set of products after the one with the key XYZ.

Now that we have learned about the common API pagination techniques, we are ready to learn about the best practices to follow when implementing paginated APIs.

Best Practices for API Pagination. When implementing API pagination in Python, there are several best practices to follow. Let's discuss these in detail.

Use a Common Naming Convention for Pagination Parameters: Adopt a consistent naming convention for pagination parameters, such as offset and limit, or page and size. This makes it easier for API consumers to understand and use your pagination system.

Always Include Pagination Metadata in API Responses: Provide metadata in the API responses to convey additional information about the pagination. This can include the total number of records, the current page, the number of pages, and links to the next and previous pages. This metadata helps API consumers navigate through the paginated data more effectively. For example, here's how the response of a paginated API could look (the counts are illustrative, as the originals were lost in the feed's extraction):

```json
{
  "data": [
    {"id": 1, "title": "Post 1", "content": "Lorem ipsum dolor sit amet", "category": "Technology"},
    {"id": 2, "title": "Post 2", "content": "Praesent fermentum orci in ipsum", "category": "Sports"},
    {"id": 3, "title": "Post 3", "content": "Vestibulum ante ipsum primis in faucibus", "category": "Fashion"}
  ],
  "pagination": {
    "total_records": 100,
    "current_page": 1,
    "total_pages": 10,
    "next_page": 2,
    "prev_page": null
  }
}
```

Determine an Appropriate Page Size: Select an optimal page size that balances the amount of data returned per page. A smaller page size reduces the response payload and improves performance, while a larger page size reduces the number of requests required. Determining an appropriate page size involves considering various factors, such as the nature of the data, performance considerations, and user experience. Here are some guidelines to help you determine the optimal page size:

- Understand the Data Characteristics: Consider the size and complexity of the individual records in your dataset. If the records are relatively small, you may be able to accommodate a larger page size without significant performance impact. On the other hand, if the records are large or contain complex nested structures, it's advisable to keep the page size smaller to avoid excessively large response payloads.
- Consider Network Latency and Bandwidth: Take into account the typical network conditions and the potential latency or bandwidth limitations that your API consumers may encounter. If users are on slower networks or have limited bandwidth, a smaller page size can help reduce the overall transfer time and improve the responsiveness of your API.
- Evaluate Performance Impact: Consider the performance implications of larger page sizes. While larger page sizes can reduce the number of API requests needed to retrieve a full dataset, they may also increase the response time and put additional strain on server resources. Measure the impact on performance and monitor the server load to strike a balance between page size and performance.
- Consider User Experience and Usability: Think about how API consumers will interact with the paginated data. Larger page sizes may result in fewer pages to navigate through, which can improve the user experience by reducing the number of pagination interactions. However, excessively large page sizes may make it challenging for users to find specific records or navigate through the data efficiently. Consider the use cases and the needs of your API consumers when determining an optimal page size.
- Provide Flexibility with Pagination Parameters: Instead of enforcing a fixed page size, consider allowing API consumers to specify their preferred page size as a parameter. This flexibility empowers consumers to choose a page size that best suits their needs and network conditions.
- Solicit User Feedback: If possible, gather feedback from API consumers to understand their preferences and requirements regarding the page size. Consider conducting surveys or seeking feedback through user forums or support channels to gather insights into their expectations and any pain points they might be experiencing.

Implement Sorting and Filtering Options: Provide sorting and filtering parameters to allow API consumers to specify the order and subset of data they require. This enhances flexibility and enables users to retrieve targeted results efficiently. Here's an example of how you can implement sorting and filtering options in a paginated API using Python. In this example, we'll use Flask, a popular web framework, to create the API (the numeric literals, ids, prices, and defaults, were lost in the feed's extraction and are reconstructed with illustrative values):

```python
from flask import Flask, request, jsonify

app = Flask(__name__)

# Dummy data
products = [
    {"id": 1, "name": "Product A", "price": 50.0, "category": "Electronics"},
    {"id": 2, "name": "Product B", "price": 25.0, "category": "Clothing"},
    {"id": 3, "name": "Product C", "price": 75.0, "category": "Electronics"},
    {"id": 4, "name": "Product D", "price": 40.0, "category": "Clothing"},
    # Add more products as needed
]

@app.route("/products", methods=["GET"])
def get_products():
    # Pagination parameters
    page = int(request.args.get("page", 1))
    per_page = int(request.args.get("per_page", 10))

    # Sorting options
    sort_by = request.args.get("sort_by", "id")
    sort_order = request.args.get("sort_order", "asc")

    # Filtering options
    category = request.args.get("category")
    min_price = float(request.args.get("min_price", 0))
    max_price = float(request.args.get("max_price", float("inf")))

    # Apply filters
    filtered_products = filter(
        lambda p: min_price <= p["price"] <= max_price, products
    )
    if category:
        filtered_products = filter(
            lambda p: p["category"] == category, filtered_products
        )
    filtered_products = list(filtered_products)

    # Apply sorting
    sorted_products = sorted(
        filtered_products,
        key=lambda p: p[sort_by],
        reverse=sort_order.lower() == "desc",
    )

    # Paginate the results
    start_index = (page - 1) * per_page
    end_index = start_index + per_page
    paginated_products = sorted_products[start_index:end_index]

    return jsonify(paginated_products)

if __name__ == "__main__":
    app.run(debug=True)
```

In this example, we define a /products endpoint that accepts various query parameters for sorting, filtering, and pagination. Here's how you can use these parameters:

- page: The page number to retrieve (default is 1)
- per_page: The number of items per page (default is 10)
- sort_by: The field to sort the products by (default is id)
- sort_order: The sort order (asc for ascending, desc for descending; default is asc)
- category: The category to filter the products by (optional)
- min_price: The minimum price to filter the products by (default is 0)
- max_price: The maximum price to filter the products by (default is infinity)

Here's an example cURL command to retrieve the first page of products, sorted by price in descending order:

```
curl -X GET "http://localhost:5000/products?page=1&per_page=10&sort_by=price&sort_order=desc"
```

Preserve Pagination Stability: Ensure that the pagination remains stable and consistent between requests. Newly added or deleted records should not affect the order or positioning of existing records during pagination. This ensures that users can navigate through the data without encountering unexpected changes. To ensure that API pagination remains stable and consistent between requests, follow these guidelines:

- Use a Stable Sorting Mechanism: If you're implementing sorting in your pagination, ensure that the sorting mechanism remains stable. This means that when multiple records have the same value for the sorting field, their relative order should not change between requests. For example, if you sort by the date field, make sure that records with the same date always appear in the same order.
- Avoid Changing Data Order: Avoid making any changes to the order or positioning of records during pagination unless explicitly requested by the API consumer. If new records are added or existing records are modified, they should not disrupt the pagination order or cause existing records to shift unexpectedly.
- Use Unique and Immutable Identifiers: It's good practice to use unique and immutable identifiers for the records being paginated. This ensures that even if the data changes, the identifiers remain constant, allowing consistent pagination. This can be a primary key or a unique identifier associated with each record.
- Handle Record Deletions Gracefully: If a record is deleted between paginated requests, it should not affect the pagination order or cause missing records. Ensure that the deletion of a record does not leave a gap in the pagination sequence. For example, if record X is deleted, subsequent requests should not suddenly skip to record Y without any explanation.
- Use Deterministic Pagination Techniques: Employ pagination techniques that offer deterministic results. Techniques like cursor-based pagination or keyset pagination, where the pagination is based on specific attributes like timestamps or unique identifiers, provide stability and consistency between requests.

Handle Edge Cases and Error Conditions: Account for edge cases such as reaching the end of the dataset, handling invalid or out-of-range page requests, and gracefully handling errors. Provide informative error messages and proper HTTP status codes to guide API consumers in handling pagination-related issues. Here are some key considerations for handling edge cases and error conditions in a paginated API:

- Out-of-Range Page Requests: When an API consumer requests a page that is beyond the available range, it's important to handle this gracefully. Return an informative error message indicating that the requested page is out of range, and provide relevant metadata in the response to indicate the maximum available page number.
- Invalid Pagination Parameters: Validate the pagination parameters provided by the API consumer. Check that the values are within acceptable ranges and meet any specific criteria you've defined. If the parameters are invalid, return an appropriate error message with details on the issue.
- Handling Empty Result Sets: If a paginated request results in an empty result set, indicate this clearly in the API response. Include metadata that indicates the total number of records and the fact that no records were found for the given pagination parameters. This helps API consumers understand that there are no more pages or data available.
- Server Errors and Exception Handling: Handle server errors and exceptions gracefully. Implement error-handling mechanisms to catch and handle unexpected errors, ensuring that appropriate error messages and status codes are returned to the API consumer. Log any relevant error details for debugging purposes.
- Rate Limiting and Throttling: Consider implementing rate-limiting and throttling mechanisms to prevent abuse or excessive API requests. Enforce sensible limits to protect the API server's resources and ensure fair access for all API consumers. Return specific error responses (e.g., HTTP 429 Too Many Requests) when rate limits are exceeded.
- Clear and Informative Error Messages: Provide clear and informative error messages in the API responses to guide API consumers when errors occur. Include details about the error type, possible causes, and suggestions for resolution if applicable. This helps developers troubleshoot and address issues effectively.
- Consistent Error-Handling Approach: Establish a consistent approach for error handling throughout your API. Follow standard HTTP status codes and error-response formats to ensure uniformity and ease of understanding for API consumers.

For example, consider the following API (as above, numeric literals in the dummy data and the parameter defaults are illustrative reconstructions):

```python
from flask import Flask, request, jsonify

app = Flask(__name__)

# Dummy data
products = [
    {"id": 1, "name": "Product A", "price": 50.0, "category": "Electronics"},
    {"id": 2, "name": "Product B", "price": 25.0, "category": "Clothing"},
    {"id": 3, "name": "Product C", "price": 75.0, "category": "Electronics"},
    {"id": 4, "name": "Product D", "price": 40.0, "category": "Clothing"},
    # Add more products as needed
]

@app.route("/products", methods=["GET"])
def get_products():
    try:
        # Pagination parameters
        page = int(request.args.get("page", 1))
        per_page = int(request.args.get("per_page", 10))

        # Sorting options
        sort_by = request.args.get("sort_by", "id")
        sort_order = request.args.get("sort_order", "asc")

        # Filtering options
        category = request.args.get("category")
        min_price = float(request.args.get("min_price", 0))
        max_price = float(request.args.get("max_price", float("inf")))

        # Validate pagination parameters
        if page < 1 or per_page < 1:
            raise ValueError("Invalid pagination parameters")

        # Apply filters
        filtered_products = filter(
            lambda p: min_price <= p["price"] <= max_price, products
        )
        if category:
            filtered_products = filter(
                lambda p: p["category"] == category, filtered_products
            )
        filtered_products = list(filtered_products)

        # Apply sorting
        sorted_products = sorted(
            filtered_products,
            key=lambda p: p[sort_by],
            reverse=sort_order.lower() == "desc",
        )

        # Validate page number
        total_products = len(sorted_products)
        total_pages = (total_products + per_page - 1) // per_page
        if page > total_pages:
            raise ValueError("Invalid page number")

        # Paginate the results
        start_index = (page - 1) * per_page
        end_index = start_index + per_page
        paginated_products = sorted_products[start_index:end_index]

        return jsonify({
            "page": page,
            "per_page": per_page,
            "total_pages": total_pages,
            "total_products": total_products,
            "products": paginated_products,
        })
    except ValueError as e:
        return jsonify({"error": str(e)}), 400

if __name__ == "__main__":
    app.run(debug=True)
```

In this example, we wrap the logic of the /products endpoint in a try/except block. If any error occurs during execution, we catch it and return a JSON response with an error message and an appropriate status code for client errors. Some error scenarios we handle in this example include:

- Invalid pagination parameters (page or per_page less than 1)
- Invalid page number (exceeding the total number of pages)

If any of these errors occur, an exception is raised with a descriptive error message. The exception is caught in the except block, and we return a JSON response with the error message and a status code of 400 (Bad Request).

Consider Caching Strategies: Implement caching mechanisms to store paginated data or metadata that does not change frequently. Caching can help improve performance by reducing the load on the server and reducing the response time for subsequent requests. Here are some caching strategies you can consider (a small sketch follows the list):

- Page-Level Caching: Cache the entire paginated response for each page. This means caching the data along with the pagination metadata. This strategy is suitable when the data is relatively static and doesn't change frequently.
- Result-Set Caching: Cache the result set of a specific query or combination of query parameters. This is useful when the same query parameters are frequently used and the result set remains relatively stable for a certain period. Cache the result set and serve it directly for subsequent requests with the same parameters.
- Time-Based Caching: Set an expiration time for the cache based on the expected freshness of the data. For example, cache the paginated response for a certain duration, such as a few minutes or an hour. Subsequent requests within the cache duration can be served directly from the cache without hitting the server.
- Conditional Caching: Use conditional caching mechanisms like HTTP ETag or Last-Modified headers. The server can respond with a 304 Not Modified status if the client's cached version is still valid. This reduces bandwidth consumption and improves response time when the data has not changed.
- Reverse-Proxy Caching: Implement a reverse-proxy server like Nginx or Varnish in front of your API server to handle caching. Reverse proxies can cache the API responses and serve them directly without forwarding the request to the backend API server. This offloads the caching responsibility from the application server and improves performance.
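As a minimal sketch of the time-based strategy from the list above (the TTL value and cache-key scheme are assumptions; a production service would more likely use Flask-Caching, Redis, or an HTTP cache):

```python
import time

CACHE_TTL_SECONDS = 600  # assumed: cache pages for 10 minutes
_cache = {}  # maps a request's query signature to (expiry_timestamp, response)

def cached_page(query_args, compute_response):
    """Serve a paginated response from cache, recomputing only after the TTL."""
    key = tuple(sorted(query_args.items()))  # cache key: normalized query params
    now = time.time()
    hit = _cache.get(key)
    if hit and hit[0] > now:
        return hit[1]  # still fresh: no recomputation, no database load
    response = compute_response()
    _cache[key] = (now + CACHE_TTL_SECONDS, response)
    return response

# Hypothetical usage inside a Flask view function:
#   return cached_page(request.args.to_dict(), lambda: jsonify(build_page()))
```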
requests with the same parameters Time Based Caching Set an expiration time for the cache based on the expected freshness of the data For example cache the paginated response for a certain duration such as minutes or hour Subsequent requests within the cache duration can be served directly from the cache without hitting the server Conditional Caching Use conditional caching mechanisms like HTTP ETag or Last Modified headers The server can respond with a Not Modified status if the client s cached version is still valid This reduces bandwidth consumption and improves response time when the data has not changed Reverse Proxy Caching Implement a reverse proxy server like Nginx or Varnish in front of your API server to handle caching Reverse proxies can cache the API responses and serve them directly without forwarding the request to the backend API server This offloads the caching responsibility from the application server and improves performance ConclusionIn conclusion implementing effective API pagination is essential for providing efficient and user friendly access to large datasets By following best practices such as including pagination metadata using stable sorting mechanisms and applying appropriate caching strategies developers can optimize the performance scalability and usability of their paginated APIs By incorporating these best practices into the design and implementation of paginated APIs developers can create highly performant scalable and user friendly interfaces for accessing large datasets With careful consideration of pagination techniques error handling and caching strategies API developers can empower their consumers to efficiently navigate and retrieve the data they need ultimately enhancing the overall API experience 2023-06-06 15:07:06
Apple AppleInsider - Frontpage News App Store Review Guideline updates go after fake apps, bad ads https://appleinsider.com/articles/23/06/06/app-store-review-guideline-updates-go-after-fake-apps-bad-ads?utm_medium=rss Apple has refined its App Store Review Guidelines and other developer-related documents, with changes taking aim at inappropriate advertising, apps that impersonate others, and Safari extensions. In an email to developers sent out on Monday, Apple advises it has updated the App Store Review Guidelines, the Apple Developer Program License Agreement, and the Apple Developer Agreement. It says the changes are to support updated policies and upcoming features, and to provide clarification. Under the App Store Review Guidelines there are a total of five changes. First on the list is an addition stating that apps that contain ads must also include the ability for users to report any inappropriate or age-inappropriate ads. Read more 2023-06-06 15:47:38
Apple AppleInsider - Frontpage News Deals: $799 MacBook Air, $1,749 MacBook Pro 14-inch, $399 iPad 10th Generation & more https://appleinsider.com/articles/23/06/06/deals-799-macbook-air-1749-macbook-pro-14-inch-399-ipad-10th-generation-more?utm_medium=rss Today's hottest deals include discounts on a Samsung Odyssey gaming monitor, a 10th Gen iPad, an M2 Pro Mac mini with AppleCare kit, Bose QuietComfort Earbuds II, and home-theater projectors. Save on an M2 MacBook Pro. The AppleInsider crew combs the internet for top-notch bargains at online retailers to create a list of amazing deals on trending tech items, including discounts on Apple products, TVs, accessories, and other gadgets. We post the best deals daily to help you save money. Read more 2023-06-06 15:35:53
Overseas TECH Engadget Apple accidentally released the iOS 17 developer beta to the public https://www.engadget.com/apple-accidentally-released-the-ios-17-developer-beta-to-the-public-155233150.html?src=rss Apple is supposed to release an iOS 17 public beta in July, but the company inadvertently gave users an early peek. As AppleInsider explains, Connor Jewiss and other users have noticed that the iOS 17 developer beta was available to install in the Beta Updates section of Settings, whether or not you paid for the necessary account. The macOS Sonoma and watchOS 10 previews have been available this way, too. We wouldn't count on any of the developer betas being available as we write this. As it is, you likely won't want to install them. These are the first pre-release versions available to people outside of Apple, and they're the most likely to include bugs and app-compatibility issues. That could cause problems if you install them on must-have devices. Unless you're a developer who wants to start preparing app updates, you're probably better off waiting until either the public beta or the finished version releases this fall. iOS 17 is an iterative upgrade, but it adds more than a few features you might appreciate, such as live voicemail transcripts, easier sharing, more intelligent autocorrection, and a journaling app. macOS Sonoma adds perks like desktop widgets, Safari privacy updates, and a Game Mode, while watchOS 10 is a significant revamp that centers on quick-glance widgets. For the most part, there's no rush to try them right away. This article originally appeared on Engadget. 2023-06-06 15:52:33
Overseas TECH Engadget The Apple Watch SE is back on sale for $219 https://www.engadget.com/the-apple-watch-se-is-back-on-sale-for-219-153046166.html?src=rss If you're looking to buy a new smartwatch, the Apple Watch SE remains one of the better values on the market, and right now its 40mm model is back on sale for $219 at Amazon and Best Buy; Target, meanwhile, has it for a dollar more. We've seen the watch fall to this price a number of times over the past few months, but it's still comfortably below the device's average street price and below Apple's $249 MSRP. If you want the larger 44mm model, that watch is also discounted from Apple's list price. Note that these offers apply to the watch's Midnight, Starlight, and Silver finishes. While neither of these deals is an all-time low (we've seen the 40mm model very briefly fall further once before), they're still strong prices for what you're getting. We gave the second-gen Apple Watch SE a favorable review when it arrived last September, and we note it as the best option for first-time buyers in our guide to the best smartwatches. It's essentially a stripped-down version of the Apple Watch Series 8, our top overall pick. The big sacrifice is its lack of an always-on display mode, so you'll have to physically lift up your wrist to check the time or notifications. Beyond that, its display is slightly smaller, it doesn't support fast charging, and it lacks more advanced health-tracking features like a skin-temperature sensor, ECG monitor, and blood-oxygen sensor. Those won't be massive omissions for many people, though, and the SE keeps the rest of the Apple Watch experience largely intact. It runs on the same chipset as the Series 8, it's still water-resistant, and it gets you access to standard features like heart-rate monitoring and fall detection. This fall, it'll also receive the same watchOS 10 update that Apple announced at WWDC on Monday. We still think the Series 8, which is also discounted at the moment, is the most well-rounded wearable for iPhone owners, and Apple will invariably launch a new Series watch by the end of the year. But for first-time buyers, or those looking to upgrade from an older Apple Watch on a budget, this should be a good deal. Follow @EngadgetDeals on Twitter and subscribe to the Engadget Deals newsletter for the latest tech deals and buying advice. This article originally appeared on Engadget. 2023-06-06 15:30:46
Overseas TECH Engadget CNET's new guidelines for AI journalism met with union pushback https://www.engadget.com/cnets-new-guidelines-for-ai-journalism-met-with-union-pushback-152311269.html?src=rss Nearly seven months after it began publishing machine-generated stories without disclosing their true authorship (or lack thereof) to readers, CNET has finally publicly changed its policy on the use of AI in its journalistic endeavors. In short, stories written by its in-house artificial intelligence, which it calls Responsible AI Machine Partner (RAMP), are no more, but the specter of AI in its newsroom is far from exorcised.

The site indicates, however, that there are still two broad categories of pursuits where RAMP will be deployed. The first, which it calls "Organizing large amounts of information," provides an example that seems more authorial than that umbrella descriptor lets on: "RAMP will help us sort things like pricing and availability data and present it in ways that tailor information to certain audiences. Without an AI assist, this volume of work wouldn't be possible."

The other, "Speeding up certain research and administrative portions of our workflow," is more troubling. "CNET editors could use AI to help automate some portions of our work so we can focus on the parts that add the most unique value," the guidelines state. "RAMP may also generate content such as explanatory material based on trusted sources that a human could fact-check and edit" (emphasis ours). You'd be forgiven if that sounds nearly identical to what got CNET into trouble in the first place.

The venerable tech site first posted an innocuously titled explainer, "What Is a Credit Card Charge-Off?", last November under the byline "CNET Money Staff," with no further explanation as to its provenance, and continued posting dozens more small finance stories under that byline through mid-January. It was around that time that Futurism discovered two important details: CNET Money Staff stories were AI-generated, and much of that work was wildly inaccurate. CNET issued corrections on over half of those stories and had, by all appearances, stopped using these sorts of tools in response to the deserved criticisms they created.

In the interim, the remaining CNET staff publicly announced their intention to unionize with the Writers Guild of America East. Among the more typical areas of concern for a shrinking newsroom during these trying times in the media industry (retention, severance, editorial independence, et cetera), the bargaining unit also specifically pushed back against the site's intention to keep deploying AI.

Based on the union's response on Twitter, the guidelines fall well short of the kinds of protections CNET's workers were hoping for. "New AI policy CNET affects workers. Before the tool rolls out, our union looks forward to negotiating: how & what data is retrieved, a regular role in testing/reevaluating [the] tool, right to opt out & remove bylines, a voice to ensure editorial integrity." (CNET Media Workers Union, @cnetunion, June)

Granted, CNET claims it will never deploy RAMP to write full stories, though it also denies it ever did so. However, the new guidelines leave the door open for that possibility, as well as the eventuality that it uses AI to generate images or videos, promising only that where "text... originated from our AI tool, we'll include that information in a disclosure."

CNET's apparent bullishness on AI, and its staff's wariness, also arrive against a backdrop of news organizations broadly looking to survive the technology's potential ill effects. The New York Times and other media groups began preliminary talks this week to discuss AI's role in disinformation and plagiarism, as well as how to ensure fair compensation when authorship becomes murky.

The prior CNET Money Staff articles have since been updated to reflect the new editorial guidelines. Each is credited to a human staff member who has rewritten the story and also lists the name of the overseeing editor. Each is now appended with the following note at the bottom: "Editors' note: An earlier version of this article was assisted by an AI engine. This version has been substantially updated by a staff writer."

This sort of basic disclosure is neither difficult nor unusual. Including the provenance of information has been one of the core tenets of journalism since well before AI became advanced enough to get a credit on the masthead, and The Associated Press has been including such disclosures in its cut-and-paste-level financial beat stories for the better part of a decade. On the one hand, much of the embarrassment around CNET's gaffe could have been avoided if it had simply warned readers where the text of these stories had come from at the outset. But the larger concern remains that, unlike AP's use of these tools, CNET seems poised to allow RAMP more freedom to do more substantive work, the bounds of which are not meaningfully changed by these guidelines.

Correction, June 6th, ET: An earlier version of this story inaccurately described how the altered stories previously written by CNET Money Staff appeared on page. This article originally appeared on Engadget. 2023-06-06 15:23:11
Overseas TECH Engadget SEC sues Coinbase over alleged violations of securities laws https://www.engadget.com/sec-sues-coinbase-over-alleged-violations-of-securities-laws-151500450.html?src=rss Another day, another regulatory action against a major cryptocurrency company. The Securities and Exchange Commission has sued Coinbase, the biggest crypto asset trading platform in the US. It claims that Coinbase operated as an unregistered national securities exchange, broker, and clearing agency. The SEC notes that brokers, exchanges, and clearing agencies are usually separated in traditional securities markets, but said Coinbase intertwines their services. The agency claimed that, by failing to register as a broker, national securities exchange, or clearing agency, Coinbase has prevented investors from having certain protections. Those include SEC inspections, safeguards against conflicts of interest, and recordkeeping requirements. The agency argued that Coinbase doesn't qualify for any applicable exemptions from registration for any of the three functions. It accused the company of having made billions of dollars from the likes of transaction fees by unlawfully facilitating the buying and selling of crypto asset securities since at least 2019.

"You simply can't ignore the rules because you don't like them or because you'd prefer different ones; the consequences for the investing public are far too great," Gurbir S. Grewal, the director of the SEC's Division of Enforcement, said in a statement. "As alleged in our complaint, Coinbase was fully aware of the applicability of the federal securities laws to its business activities, but deliberately refused to follow them. While Coinbase's calculated decisions may have allowed it to earn billions, it's done so at the expense of investors by depriving them of the protections to which they are entitled."

It was reported last July that the SEC was investigating Coinbase as to whether the company illegally sold unregistered securities. As The New York Times notes, news of the agency's complaint comes on the same day that Coinbase's chief legal officer, Paul Grewal, is set to testify before a congressional committee in relation to a new draft bill that aims to bring in some crypto regulations. In March, Coinbase said it received a notice from the SEC that agency staff had found potential securities-law violations, but it was not provided with much detail. The company also claimed it provided multiple proposals to the SEC about registration over the course of months, all of which the SEC ultimately refused to respond to.

On Monday, the SEC filed charges against Binance and its CEO Changpeng Zhao. The agency claimed Binance skirted its own compliance measures and lied to investors and regulators. The SEC also claimed that Binance mishandled customer funds. In addition, the agency is involved in the government's case against FTX founder and former CEO Sam Bankman-Fried.

Meanwhile, Coinbase is facing regulatory action at the state level. A task force comprising state regulators from Alabama, California, Illinois, Kentucky, Maryland, New Jersey, South Carolina, Vermont, Washington, and Wisconsin resulted in a Show Cause Order being issued against the exchange. In a statement spotted by Cointelegraph, the Alabama Securities Commission accused the company of violating securities law by offering its staking-rewards program accounts to Alabama residents without a registration to offer or sell these securities. It gave the company a deadline to show cause why it shouldn't be ordered to cease and desist from selling unregistered securities in the state. This article originally appeared on Engadget. 2023-06-06 15:15:00
Overseas Tech Engadget The best budget laptops for 2023 https://www.engadget.com/best-budget-laptop-150038435.html?src=rss The best budget laptops for 2023: Not everyone wants or needs to spend a boatload of money on a new laptop. Depending on how you use your notebook (whether you're a student, a creative professional, or anything in between), it may not be necessary to drop thousands on the latest model with top-of-the-line specs. Budget laptops do exist, even if they don't get as much attention as their flagship counterparts. If you're looking to spend only what you absolutely must on your next laptop, we've got a number of top picks for you to consider, plus some shopping advice that can help you choose the best budget laptop for you.

What to look for in a budget laptop: First, we at Engadget consider anything under $1,000 to be "budget" in the laptop space. The reason for this is twofold: even the most affordable flagship laptops typically start at $1,000 or more, and if you go dramatically lower than that, that's where you'll really start to see compromises in performance. You'll typically find the best balance of power and price in the $500 to $1,000 range, but in this guide we cover top picks at a wide range of prices, on both the low and high end of the budget spectrum. Arguably the biggest thing to look for in a budget laptop is a decent spec sheet. You might be able to find options with the latest-generation CPUs, or you may have to go for one with a slightly older processor. We recommend finding a notebook with internals as up to date as possible, but know that a machine with a CPU one generation behind probably won't suffer a significant hit to performance. Along with processors, you should also consider the amount of memory and storage you need in a daily driver. For the former, we recommend laptops with at least 8GB of RAM; anything with less will have a hard time multitasking and managing all those browser tabs. The latter is a bit more personal: how much onboard storage you need really depends on how many apps, files, photos, documents, and more you will save locally. As a general rule of thumb, try to go for a laptop that has at least a 256GB SSD (this only goes for macOS and Windows machines, as Chromebooks are a bit different). That should give you enough space for programs and files, plus room for future operating-system updates. After determining the best performance you can get while sticking to your budget, it's also worth examining a few different design aspects. We recommend picking a machine with a mostly metal body, a screen with at least 1080p resolution, and a keyboard and trackpad area that's relatively spacious. Any laptop worth purchasing will have a built-in webcam, but most of them top out at 720p; a few of the latest models have 1080p webcams, but you may want to consider a standalone peripheral if you spend a ton of time in Zoom meetings. Be sure to check out the port situation as well. Many laptops closer to $1,000 will have fewer ports than their more affordable counterparts, as counterintuitive as that may seem. You'll find at least one or two USB-C ports on the newest machines, which means you may need a separate dongle if you frequently have to connect to SD cards.

A note about refurbished laptops: Refurbished laptops are another option to consider if you need a new machine and don't want to spend a ton of money. Buying refurbished tech can be tricky if you're unfamiliar with a brand's or merchant's policies around what they classify as "refurbished," but it's not impossible. For laptops, we recommend going directly to the manufacturer for refurbished devices. Apple, Dell, and Microsoft all have official refurbishment processes that their devices go through before being put back on the market, verifying that the machines work properly and are in good condition. Third-party retailers like Amazon and Walmart have their own refurbishment programs for laptops and other gadgets as well.

The best budget laptops. Best overall: MacBook Air M1. There's a reason Apple kept the MacBook Air M1 in its lineup even after coming out with the 13- and 15-inch Air M2 laptops. The first machine with Apple's custom system-on-a-chip, the Air M1 was released at the end of 2020 and proved that the company didn't need Intel to power its notebooks anymore. The M1 processor gave the Air blazing-fast performance, with a responsiveness akin to that of an iPad Pro. That hasn't changed even after the launch of the M2 chipset and the latest Air powered by it. You're still going to get impressive performance from the MacBook Air M1 that will be just right for most people as a daily driver. The Air M1 has the classic wedge design we've seen in this family of notebooks for years, which some will appreciate. It may not be the refined profile of the M2 machine, but it's still thin and light, and since it lacks a fan, it's super quiet as well. The 13.3-inch Retina display is lovely, and it's accompanied by a comfortable keyboard (sans Touch Bar) and a spacious trackpad. Battery life clocked in at nearly 16 hours in our testing, which will be more than enough for a full day's work. It may sit at the top end of our budget price range, but it will be money well spent; we've also frequently seen the MacBook Air M1 discounted when it goes on sale at Amazon and other retailers. Read our full review of the Apple MacBook Air M1.

Best budget Windows laptop: HP Pavilion Aero 13. If you like the general aesthetics of machines like Dell's XPS but don't want to pay a flagship price, the HP Pavilion Aero 13 is your best bet. We gave it a high score in our review and compared it to Dell's flagship laptop. It's certainly not as sleek as that machine, but it comes pretty close with its angled profile, roughly two-pound weight, and anti-glare 13.3-inch display. Despite its keyboard being a little cramped, it's a solid typing machine, and we appreciate all of its connectivity options: one USB-C port, two USB-A ports, an HDMI connector, and a headphone jack. The Aero has also gone on sale for even less than its list price. All of the prebuilt models available from HP directly come with Ryzen processors, and you can customize the laptop with a higher-end Ryzen CPU, more RAM, and a larger SSD. Read our full review of the HP Pavilion Aero 13.

Best Chromebook: Lenovo IdeaPad Flex 5i. It's been a couple of years since we named Lenovo's IdeaPad Flex 5i our favorite Chromebook, and it remains our top pick today. That's because it still has the best mix of specs and features to suit most Chrome OS lovers. It runs on an 11th-generation Core i3 processor with 8GB of RAM and 64GB of storage, and its bright 13.3-inch 1080p display is great for working in Google Docs and streaming on Netflix. While not a standout in the design department, this convertible is relatively lightweight, and we appreciate that it comes with a backlit keyboard, something you don't often see in laptops at this price point. It should also last around eight hours on a single charge, long enough to get you through a typical workday. You're getting a solid port collection here too: two USB-C ports, one USB-A port, a microSD card slot, and a headphone jack. All of that keeps the Flex 5i ahead of the Chromebook pack, and its affordable price tag makes it even better. Read our full review of the Lenovo IdeaPad Flex 5i.

Best under $500: Acer Aspire 5. Acer's Aspire 5 family is a solid Windows option if you have less than $500 to spend on a new laptop. The most recent models hit a good middle ground for most people, running on Intel 12th-gen CPUs with a range of RAM and storage configurations. Of course, the higher the specs, the more expensive the machine; not all Aspire laptops come in under $500. But you can currently pick up a model with a 15.6-inch 1080p display, a Core i3 processor, and 8GB of RAM for comfortably under that cap, or even less when it's on sale. Design is pretty basic here, but you do get a handy number pad and a variety of ports, including one USB-C connector, three USB-A ports, and an Ethernet port. We also appreciate that the latest Aspire 5s support WiFi 6 and that Acer has upped the estimated battery life. This article originally appeared on Engadget at 2023-06-06 15:00:38
Cisco Cisco Blog Cloud-delivered OT services to simplify and scale IT for industrial networks https://feedpress.me/link/23532/16165655/cloud-delivered-ot-services-to-simplify-and-scale-it-for-industrial-networks Cloud-delivered OT services to simplify and scale IT for industrial networks: Deliver simplicity and scale for IT and operations. Leverage the network as a platform for innovation to ensure secure operations and enable access to operational devices 2023-06-06 15:30:11
Cisco Cisco Blog Fixing Things Before They Break: A More Proactive Network Mantra https://feedpress.me/link/23532/16165656/fixing-things-before-they-break-a-more-proactive-network-mantra management 2023-06-06 15:30:06
Cisco Cisco Blog Unleashing Innovation Starts with Unifying Experiences https://feedpress.me/link/23532/16165657/unleashing-innovation-starts-with-unifying-experiences Unleashing Innovation Starts with Unifying Experiences: Cisco Networking Cloud is our vision to simplify IT everywhere, at every scale. Learn how we're enabling a unified management experience platform on your terms 2023-06-06 15:30:03
Cisco Cisco Blog Going Beyond "Next Generation" Network Security https://feedpress.me/link/23532/16165658/going-beyond-next-generation-network-security-cisco-platform-approach Going Beyond "Next Generation" Network Security: Learn how Cisco Security's firewall innovations unify network security, bringing best-in-class data center security, multicloud security, and management together 2023-06-06 15:29:31
Cisco Cisco Blog Simplifying How Customers Unleash the Power of Our Platforms https://feedpress.me/link/23532/16165659/simplifying-how-customers-unleash-the-power-of-our-platforms Simplifying How Customers Unleash the Power of Our Platforms: Delivering a simpler platform experience starts now. Discover how our new platform experience can enable operational simplicity, efficiency, and reliability to transform your business 2023-06-06 15:29:29
Overseas Science NYT > Science A Summer Without Arctic Sea Ice Could Come a Decade Sooner Than Expected https://www.nytimes.com/2023/06/06/climate/arctic-sea-ice-melting.html A Summer Without Arctic Sea Ice Could Come a Decade Sooner Than Expected: In a new study, scientists found that the climate milestone could come about a decade sooner than anticipated, even if planet-warming emissions are gradually reduced 2023-06-06 15:03:16
Overseas Science NYT > Science Merck Sues Over Medicare Drug-Price Negotiation Law https://www.nytimes.com/2023/06/06/business/merck-medicare-drug-prices.html prices 2023-06-06 15:24:13
Finance Financial Services Agency (FSA) website The FSA has published issue No. 238 of its newsletter "Access FSA" https://www.fsa.go.jp/access/index.html FSA 2023-06-06 17:00:00
News BBC News - Home Criminal investigation launched over royal escort crash https://www.bbc.co.uk/news/uk-england-london-65826561?at_medium=RSS&at_campaign=KARANGA edinburgh 2023-06-06 15:51:47
News BBC News - Home Bournemouth beach boat operations suspended after deaths https://www.bbc.co.uk/news/uk-england-dorset-65823704?at_medium=RSS&at_campaign=KARANGA bournemouth 2023-06-06 15:54:02
News BBC News - Home PGA Tour & DP World Tour agree shock merger with LIV Golf to end split in golf https://www.bbc.co.uk/sport/golf/65825327?at_medium=RSS&at_campaign=KARANGA PGA Tour & DP World Tour agree shock merger with LIV Golf to end split in golf: The PGA Tour and DP World Tour agree to merge with Saudi Arabian-backed circuit LIV Golf in a deal that ends the acrimonious split in the game 2023-06-06 15:48:02
News BBC News - Home Ian Blackford to stand down as SNP MP at next election https://www.bbc.co.uk/news/uk-scotland-scotland-politics-65827165?at_medium=RSS&at_campaign=KARANGA election 2023-06-06 15:49:53
News BBC News - Home Cuba Gooding Jr settles rape lawsuit ahead of civil trial https://www.bbc.co.uk/news/entertainment-arts-65825715?at_medium=RSS&at_campaign=KARANGA trial 2023-06-06 15:29:51
News BBC News - Home CBI: Scandal-hit business group wins survival vote https://www.bbc.co.uk/news/business-65809069?at_medium=RSS&at_campaign=KARANGA survival 2023-06-06 15:44:56
News BBC News - Home 'Thicko, cheat, underage drinker' - Key extracts from Prince Harry's statement https://www.bbc.co.uk/news/uk-65819707?at_medium=RSS&at_campaign=KARANGA gathering 2023-06-06 15:32:14
News BBC News - Home Love Island 2023 summer launch loses a million TV viewers https://www.bbc.co.uk/news/entertainment-arts-65818567?at_medium=RSS&at_campaign=KARANGA island 2023-06-06 15:40:58
News BBC News - Home French Open 2023: Aryna Sabalenka 'does not support' Belarusian president Alexander Lukashenko https://www.bbc.co.uk/sport/tennis/65826192?at_medium=RSS&at_campaign=KARANGA French Open: Aryna Sabalenka 'does not support' Belarusian president Alexander Lukashenko. Sabalenka says she does not support Lukashenko right now because of her nation's support of Russia's war in Ukraine 2023-06-06 15:33:04
News BBC News - Home Ukraine dam: Thousands flee floods after dam collapse near Nova Kakhovka https://www.bbc.co.uk/news/world-europe-65819591?at_medium=RSS&at_campaign=KARANGA ukraine 2023-06-06 15:45:23
News BBC News - Home Prince Harry: British press and government at rock bottom https://www.bbc.co.uk/news/uk-politics-65822218?at_medium=RSS&at_campaign=KARANGA sussex 2023-06-06 15:09:38
GCP Cloud Blog What's new with Google Cloud https://cloud.google.com/blog/topics/inside-google-cloud/whats-new-google-cloud/ What's new with Google Cloud: Want to know the latest from Google Cloud? Find it here in one handy location. Check back regularly for our newest updates, announcements, resources, events, learning opportunities, and more. (Tip: Not sure where to find what you're looking for on the Google Cloud blog? Start here: Google Cloud blog, full list of topics, links, and resources.)

Week of June: Global External HTTP(S) Load Balancer and Cloud CDN advanced traffic management using flexible pattern matching is now GA. This allows you to use wildcards anywhere in your path matcher; you can use this to customize origin routing for different types of traffic, request and response behaviors, and caching policies. In addition, you can now use results from your pattern matching to rewrite the path that is sent to the origin. Dataform is generally available. Dataform offers an end-to-end experience to develop, version-control, and deploy SQL pipelines in BigQuery. Using a single web interface, data engineers and data analysts of all skill levels can build production-grade SQL pipelines in BigQuery while following software engineering best practices such as version control with Git, CI/CD, and code lifecycle management. Learn more. The Public Preview of frontend mutual TLS support on Global External HTTPS Load Balancing is now available: you can now use Global External HTTPS Load Balancing to offload mutual TLS authentication for your workloads, including client mTLS for Apigee X northbound traffic. FinOps from the field: how to build a FinOps roadmap. In a world where cloud services have become increasingly complex, how do you take advantage of the features without the nasty bill shock at the end? Learn how to build your own FinOps roadmap, step by step, with helpful tips and tricks from the FinOps workshops Google has completed with customers.

Week of May-June: Google Cloud Deploy: the price of an active delivery pipeline has been reduced, and single-target delivery pipelines no longer incur a charge. Underlying service charges continue to apply; see the pricing page for more details.

Week of May: Security Command Center (SCC) Premium pricing for project-level activation is now lower for customers who use SCC to secure Compute Engine, GKE Autopilot, App Engine, and Cloud SQL; please see our updated rate card. We have also expanded the number of finding types available for project-level Premium activations to help make your environment more secure. Learn more. Vertex AI Embeddings for Text: grounding LLMs made easy. Many people are now starting to think about how to bring gen AI and large language models (LLMs) to production services. You may be wondering how to integrate LLMs or AI chatbots with existing IT systems, databases, and business data ("We have thousands of products; how can I let the LLM memorize them all precisely?"), or how to handle the hallucination issues in AI chatbots to build a reliable service. Here is a quick solution: grounding with embeddings and vector search. What is grounding? What are embeddings and vector search? In this post we cover these crucial concepts for building reliable gen AI services for enterprise use, with live demos and source code; a minimal embedding sketch also follows below.
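As a rough illustration of the embedding half of that grounding pattern, here is a hedged sketch using the Vertex AI SDK for Python; the project ID, location, and model version are illustrative assumptions, not values from the post.

    # Sketch: turn text into an embedding vector with the Vertex AI SDK.
    # "my-project", the region, and the model version are assumed names.
    import vertexai
    from vertexai.language_models import TextEmbeddingModel

    vertexai.init(project="my-project", location="us-central1")
    model = TextEmbeddingModel.from_pretrained("textembedding-gecko@001")

    [embedding] = model.get_embeddings(
        ["How do I ground an LLM in my product catalog?"]
    )
    vector = embedding.values  # a list of floats, ready for a vector index
    print(len(vector))

The resulting vectors would then be stored in a vector search index (for example, Vertex AI Matching Engine) so user queries can be matched against your own data before prompting the LLM.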
Week of May: Introducing the date/time selector in Log Analytics in Cloud Logging. You can now easily customize the date and time range of your queries in the Log Analytics page by using the same date/time range selector used in Logs Explorer, Metrics Explorer, and other Cloud Ops products. There are several time-range options, such as preset times, custom start and end times, and relative time ranges. For more information, see "Filter by time" in the Log Analytics docs. Cloud Workstations is now GA. We are thrilled to announce the general availability of Cloud Workstations, with a list of new and enhanced features, providing fully managed integrated development environments (IDEs) on Google Cloud. Cloud Workstations enables faster developer onboarding and increased developer productivity, while helping support your compliance requirements with an enhanced security posture. Learn more.

Week of May: Google is partnering with regional carriers Chunghwa Telecom, Innove (a subsidiary of Globe Group), and AT&T to deliver the TPU (Taiwan-Philippines-U.S.) cable system, connecting Taiwan, the Philippines, Guam, and California, to support growing demand in the APAC region. We are committed to providing Google Cloud customers with a resilient, high-performing global network. NEC is the supplier, and the system is expected to be ready for service in 2025. Introducing BigQuery differential privacy: SQL building blocks that analysts and data scientists can use to anonymize their data. We are also partnering with Tumult Labs to help Google Cloud customers with their differential privacy implementations; a hedged query sketch follows below.
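To make that announcement concrete, here is a hedged sketch of what a differentially private aggregation can look like when submitted through the BigQuery Python client; the table, columns, and epsilon/delta values are assumptions chosen for illustration.

    # Sketch: a differentially private aggregate in BigQuery.
    # Assumed table and columns; epsilon/delta picked arbitrarily.
    from google.cloud import bigquery

    client = bigquery.Client()
    sql = """
    SELECT
      WITH DIFFERENTIAL_PRIVACY
        OPTIONS (epsilon = 1.0, delta = 1e-5, privacy_unit_column = user_id)
      item,
      AVG(spend) AS avg_spend  -- noise is added to protect individuals
    FROM `my-project.my_dataset.purchases`
    GROUP BY item
    """
    for row in client.query(sql).result():
        print(row.item, row.avg_spend)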
Scalable electronic trading on Google Cloud, a business case with BidFX: working with Google Cloud, BidFX has been able to develop and deploy a new product called Liquidity Provision Analytics (LPA), launching to production within roughly six months, to solve the transaction cost analysis challenge in an innovative way. LPA will offer features such as skew detection for liquidity providers, execution-time optimization, pricing comparison, top-of-book analysis, and feedback to counterparties. Read more here. AWS EC2 VM discovery and assessment: mFit can discover EC2 VM inventory in your AWS region and collect guest-level information from multiple VMs to provide a technical fit assessment for modernization; see the demo video. Generate assessment reports in Microsoft Excel: mFit can generate a detailed assessment report in Microsoft Excel (XLSX) format, which can handle large numbers of VMs in a single report, more than an HTML report might be able to handle. Regulatory Reporting Platform: regulatory reporting remains a challenge for financial services firms. We share our point of view on the main challenges and opportunities in our latest blog, accompanied by an infographic and a customer case study from ANZ Bank. We also wrote a white paper for anyone looking for a deeper dive into our Regulatory Reporting Platform.

Week of May: Microservices observability is now generally available for C++, Go, and Java. This release includes a number of new features and improvements, making it easier than ever to monitor and troubleshoot your microservices applications; learn more in our user guide. Google Cloud Deploy now uses a new default Skaffold version for all target types (release notes). Cloud Build: you can now configure Cloud Build to continue executing a build even if specified steps fail; this feature is generally available. Learn more here.

Week of April: Custom modules for Security Health Analytics are now generally available: author custom detective controls in Security Command Center using the new custom-module capability. Next-generation Confidential VMs are now available in Private Preview with a Confidential Computing technology called AMD Secure Encrypted Virtualization-Secure Nested Paging (AMD SEV-SNP) on general-purpose N2D machines. Confidential VMs with AMD SEV-SNP enabled build upon memory encryption and add new hardware-based security protections such as strong memory integrity and encrypted register state (thanks to AMD SEV Encrypted State, SEV-ES), as well as hardware-rooted remote attestation. Sign up here. Selecting Tier_1 networking for your Compute Engine VM can give you the bandwidth you need for demanding workloads; check out this blog on increasing bandwidth to Compute Engine VMs with Tier_1 networking.

Week of April: Use Terraform to manage Log Analytics in Cloud Logging. You can now configure Log Analytics on Cloud Logging buckets and BigQuery linked datasets by using the google_logging_project_bucket_config and google_logging_linked_dataset Terraform resources.

Week of April: Assured Open Source Software is generally available for the Java and Python ecosystems. Assured OSS is offered at no charge and provides an opportunity for any organization that uses open source software to take advantage of Google's expertise in securing open source dependencies. BigQuery change data capture (CDC) is now in public preview. BigQuery CDC provides a fully managed method of processing and applying streamed UPSERT and DELETE operations directly into BigQuery tables in real time through the BigQuery Storage Write API. This further enables real-time replication of more classically transactional systems into BigQuery, empowering cross-functional analytics between OLTP and OLAP systems. Learn more here; a hedged table-setup sketch follows below.
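As a hedged sketch of the table-side setup this feature relies on, the DDL below (run through the Python client) gives a table a non-enforced primary key and a max_staleness option; every name and the interval are illustrative assumptions, and the UPSERT/DELETE rows themselves would then be streamed through the Storage Write API with a _CHANGE_TYPE value on each row.

    # Sketch: preparing a BigQuery table for CDC (assumed names throughout).
    from google.cloud import bigquery

    client = bigquery.Client()
    client.query("""
    CREATE TABLE IF NOT EXISTS `my-project.my_dataset.customers` (
      customer_id INT64,
      name STRING,
      updated_at TIMESTAMP,
      PRIMARY KEY (customer_id) NOT ENFORCED  -- CDC needs a primary key
    )
    OPTIONS (max_staleness = INTERVAL 15 MINUTE)  -- bounds read staleness
    """).result()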
Week of April: Now available: Google Cloud Deploy supports canary release as a deployment strategy; this feature is supported in Preview. Learn more. General availability: Cloud Run services as backends to internal HTTP(S) load balancers and regional external HTTP(S) load balancers. Internal load balancers allow you to establish private connectivity between Cloud Run services and other services and clients on Google Cloud, on-premises, or on other clouds. In addition, you get custom domains, tools to migrate traffic from legacy services, Identity-Aware Proxy support, and more. A regional external load balancer, as the name suggests, is designed to reside in a single region and connect with workloads only in the same region, which helps you meet your regionalization requirements. Learn more. New visualization tools for Compute Engine fleets: the Observability tab in the Compute Engine console VM List page has reached general availability. The new Observability tab is an easy way to monitor and troubleshoot the health of your fleet of VMs. Datastream for BigQuery is generally available, offering a unique, truly seamless and easy-to-use experience that enables near real-time insights in BigQuery with just a few steps. Using BigQuery's newly developed change data capture (CDC) and the Storage Write API's UPSERT functionality, Datastream efficiently replicates updates directly from source systems into BigQuery tables in real time. You no longer have to waste valuable resources building and managing complex data pipelines, self-managed staging tables, tricky DML merge logic, or manual conversion from database-specific data types into BigQuery data types. Just configure your source database, connection type, and destination in BigQuery, and you're all set: Datastream for BigQuery will backfill historical data and continuously replicate new changes as they happen. Now available: the "Build an analytics lakehouse on Google Cloud" whitepaper. The analytics lakehouse combines the benefits of data lakes and data warehouses without the overhead of each. In this paper we discuss the end-to-end architecture, which enables organizations to extract data in real time regardless of which cloud or datastore the data resides in, and to use the data in aggregate for greater insight and artificial intelligence (AI), all with governance and unified access across teams. Download now.

Week of March: Faced with strong data growth, Squarespace made the decision to move away from on-premises Hadoop to a cloud-managed solution for its data platform. Learn how they reduced their escalations with the analytics lakehouse on Google Cloud. Read now. Last chance: register to attend the Google Data Cloud & AI Summit. Join us on Wednesday, March 29, to discover how you can use data and AI to reveal opportunities to transform your business and make your data work smarter. Find out how organizations are using Google Cloud data and AI solutions to transform customer experiences, boost revenue, and reduce costs. Register today for this no-cost digital event. New BigQuery editions: flexibility and predictability for your data cloud. At the Data Cloud & AI Summit, we announced BigQuery pricing editions (Standard, Enterprise, and Enterprise Plus) that allow you to choose the right price-performance for individual workloads. Along with editions, we also announced autoscaling capabilities that ensure you only pay for the compute capacity you use, and a new compressed storage billing model designed to reduce your storage costs. Learn more about the latest BigQuery innovations and register for the upcoming BigQuery roadmap session. Introducing Looker Modeler: a single source of truth for BI metrics. At the Data Cloud & AI Summit, we introduced a standalone metrics layer we call Looker Modeler, available in preview in Q2. With Looker Modeler, organizations can benefit from consistent, governed metrics that define data relationships and progress against business priorities, and consume them in BI tools such as Connected Sheets, Looker Studio, Looker Studio Pro, Microsoft Power BI, Tableau, and ThoughtSpot. Bucket-based log-based metrics, now generally available, allow you to track, visualize, and alert on important logs in your cloud environment from many different projects, or across the entire organization, based on what logs are stored in a log bucket.

Week of March: Chronicle Security Operations feature roundup. Bringing a modern and unified security operations experience to our customers is and has been a top priority for the Google Chronicle team. We're happy to show continuing innovation and even more valuable functionality: in our latest release roundup, we highlight a host of new capabilities focused on delivering improved context, collaboration, and speed to handle alerts faster and more effectively. Learn how our newest capabilities enable security teams to do more with less here. Announcing Google's Data Cloud & AI Summit, March 29: can your data work smarter? How can you use AI to unlock new opportunities? Join us on Wednesday, March 29, to gain expert insights, new solutions, and strategies to reveal opportunities hiding in your company's data. Register today for this no-cost digital event. Artifact Registry feature preview: Artifact Registry now supports immutable tags for Docker repositories. If you enable this setting, an image tag always points to the same image digest, including the default "latest" tag. This feature is in Preview; learn more.
Week of March: A new era for AI and Google Workspace. Google Workspace is using AI to become even more helpful, starting with new capabilities in Docs and Gmail to write and refine content. Learn more. Building the most open and innovative AI ecosystem: in addition to this week's news on AI products, Google Cloud has also announced new partnerships, programs, and resources. This includes bringing the best of Google's infrastructure, AI products, and foundation models to partners at every layer of the AI stack: chipmakers, companies building foundation models and AI platforms, technology partners enabling companies to develop and deploy machine learning (ML) models, app builders solving customer use cases with generative AI, and global services and consulting firms that help enterprise customers implement all of this technology at scale. Learn more. From microbrows to microservices: Ulta Beauty is building its digital store of the future, but to maintain control over its newly modernized application, it turned to Anthos and GKE, Google Cloud's managed container services, to provide an e-commerce experience as beautiful as its guests. Read our blog to see how a newly minted cloud architect learned Kubernetes and Google Cloud to provide the best possible architecture for his developers. Learn more. Now generally available: understand and trust your data with Dataplex data lineage, a fully managed Dataplex capability that helps you understand how data is sourced and transformed within the organization. Dataplex data lineage automatically tracks data movement across BigQuery, BigLake, Cloud Data Fusion (Preview), and Cloud Composer (Preview), eliminating operational hassles around manual curation of lineage metadata. Learn more here. Rapidly expand the reach of Spanner databases with read-only replicas and zero-downtime moves. Configurable read-only replicas let you add read-only replicas to any Spanner instance to deliver low-latency reads to clients in any geography. Alongside Spanner's zero-downtime instance move service, you have the freedom to move your production Spanner instances from any configuration to another on the fly, with zero downtime, whether it's regional, multi-regional, or a custom configuration with configurable read-only replicas. Learn more here.

Week of March: Automatically blocking project SSH keys in Dataflow is now GA. This service option allows Dataflow users to prevent their Dataflow worker VMs from accepting SSH keys that are stored in project metadata, and results in improved security. Getting started is easy: enable the block-project-ssh-keys service option when submitting your Dataflow job, as in the sketch below.
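A minimal, hedged sketch of enabling that option from the Apache Beam Python SDK follows; the project, region, and bucket are placeholders rather than values from the post.

    # Sketch: submit a Dataflow job with project SSH keys blocked.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(
        runner="DataflowRunner",
        project="my-project",                # placeholder
        region="us-central1",                # placeholder
        temp_location="gs://my-bucket/tmp",  # placeholder
        dataflow_service_options=["block_project_ssh_keys"],
    )

    with beam.Pipeline(options=options) as pipeline:
        _ = (
            pipeline
            | beam.Create(["hello", "world"])
            | beam.Map(print)
        )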
Celebrate International Women's Day: learn about the leaders driving impact at Google Cloud and creating pathways for other women in their industries. Read more. Google Cloud Deploy now supports parallel deployment to GKE and Cloud Run workloads; this feature is in Preview. Read more. Sumitovant doubles medical research output in one year using Looker. Sumitovant is a leading biopharma research company that has doubled its research output in one year alone. By leveraging modern cloud data technologies, Sumitovant supports its globally distributed workforce of scientists in developing next-generation therapies, using Google Cloud's Looker for trusted self-service data research.

Week of Feb-Mar: Add geospatial intelligence to your retail use cases by leveraging the CARTO platform on top of your data in BigQuery. Location data adds a new dimension to retail use cases like site selection, geomarketing, and logistics and supply chain optimization. Read more about the solution and various customer implementations in the CARTO for Retail reference guide, and see a demonstration in this blog. Google Cloud Deploy support for deployment verification is now GA; read more or try the demo.

Week of Feb: Logs for Network Load Balancing and logs for Internal TCP/UDP Load Balancing are now GA. Logs are aggregated per connection and exported in near real time, providing useful information such as the tuples of the connection and received and sent bytes, for troubleshooting and monitoring the pass-through Google Cloud load balancers. Further, customers can include additional optional fields, such as annotations for client-side and server-side GCE and GKE resources, to obtain richer telemetry. The newly published Anthos hybrid cloud architecture reference design guide provides opinionated guidance for deploying Anthos in a hybrid environment to address some common challenges you might encounter. Check out the reference design guide here to accelerate your journey to hybrid cloud and containerization.

Week of Feb: Deploy PyTorch models on Vertex AI in a few clicks with prebuilt PyTorch serving containers, which means less code, no need to write Dockerfiles, and faster time to production. Confidential GKE Nodes on Compute-Optimized C2D VMs are now GA. Confidential GKE Nodes help increase the security of your GKE clusters by leveraging hardware to ensure your data is encrypted in memory, helping to defend against accidental data leakage, malicious administrators, and "curious neighbors". Getting started is easy, as your existing GKE workloads can run confidentially with no code changes required. Announcing Google's Data Cloud & AI Summit, March 29: can your data work smarter? How can you use AI to unlock new opportunities? Register for the Google Data Cloud & AI Summit, a digital event for data and IT leaders, data professionals, developers, and more, to explore the latest breakthroughs. Join us on Wednesday, March 29, to gain expert insights, new solutions, and strategies to reveal opportunities hiding in your company's data. Find out how organizations are using Google Cloud data and AI solutions to transform customer experiences, boost revenue, and reduce costs. Register today for this no-cost digital event. Running SAP workloads on Google Cloud? Upgrade to our newly released Agent for SAP to gain increased visibility into your infrastructure and application performance. The new agent consolidates several of our existing agents for SAP workloads, which means less time spent on installation and updates, and more time for making data-driven decisions. In addition, there is new optional functionality that powers exciting products like Workload Manager, a way to automatically scan your SAP workloads against best practices. Learn how to install or upgrade the agent here. Leverege uses BigQuery as a key component of its data and analytics pipeline to deliver innovative IoT solutions at scale. As part of the Built with BigQuery program, this blog post goes into detail about the Leverege IoT Stack that runs on Google Cloud to power business-critical enterprise IoT solutions at scale. Download the white paper "Three Actions Enterprise IT Leaders Can Take to Improve Software Supply Chain Security" to learn how and why high-profile software supply chain attacks like SolarWinds and Log4j happened, the key lessons learned from these attacks, and the actions you can take today to prevent similar attacks from happening to your organization.
Week of Feb: Immersive Stream for XR leverages Google Cloud GPUs to host, render, and stream high-quality photorealistic experiences to millions of mobile devices around the world, and is now generally available. Read more here. Reliable and consistent data presents an invaluable opportunity for organizations to innovate, make critical business decisions, and create differentiated customer experiences, but poor data quality can lead to inefficient processes and possible financial losses. Today we announce new Dataplex features: automatic data quality (AutoDQ) and data profiling, available in public preview. AutoDQ offers automated rule recommendations, built-in reporting, and serverless execution to construct high-quality data; data profiling delivers richer insight into the data by identifying its common statistical characteristics. Learn more. Cloud Workstations now supports Customer-Managed Encryption Keys (CMEK), which provides user encryption control over Cloud Workstations persistent disks. Read more. Google Cloud Deploy now supports Cloud Run targets in general availability. Read more. Learn how to use NetApp Cloud Volumes Service as datastores for Google Cloud VMware Engine to expand storage capacity. Read more.

Week of Jan-Feb: Oden Technologies uses BigQuery to provide real-time visibility, efficiency recommendations, and resiliency in the face of network disruptions in manufacturing systems. As part of the Built with BigQuery program, this blog post describes the use cases, challenges, solution, and solution architecture in great detail. Manage table- and column-level access permissions using attribute-based policies in Dataplex. The Dataplex attribute store provides a unified place where you can create and organize a data-class hierarchy to classify your distributed data and assign behaviors, such as table ACLs and column ACLs, to the classified data classes. Dataplex will propagate IAM roles to tables across multiple Google Cloud projects according to the attributes assigned to them, and a single merged policy tag to columns according to the attributes attached to them. Read more. Lytics is a next-generation composable CDP that enables companies to deploy a scalable CDP around their existing data warehouse or lakes. As part of the Built with BigQuery program for ISVs, Lytics leverages Analytics Hub to launch a secure data sharing and enrichment solution for media and advertisers. This blog post goes over Lytics Conductor on Google Cloud and its architecture in great detail. Now available in public preview: Dataplex business glossary offers users a cloud-native way to maintain and manage business terms and definitions for data governance, establishing consistent business language, improving trust in data, and enabling self-serve use of data. Learn more here. Security Command Center (SCC), Google Cloud's native security and risk management solution, is now available via self-service to protect individual projects from cyber attacks. It's never been easier to secure your Google Cloud resources with SCC. Read our blog to learn more; to get started today, go to Security Command Center in the Google Cloud console for your projects. Global External HTTP(S) Load Balancer and Cloud CDN now support advanced traffic management using flexible pattern matching, in public preview. This allows you to use wildcards anywhere in your path matcher; you can use this to customize origin routing for different types of traffic, request and response behaviors, and caching policies. In addition, you can now use results from your pattern matching to rewrite the path that is sent to the origin.
Run large Pods on GKE Autopilot with the Balanced compute class: when you need computing resources on the larger end of the spectrum, we're excited that the Balanced compute class, which supports larger Pod resource sizes, is now GA.

Week of Jan: Starting with Anthos version 1.14, Google supports each Anthos minor version for 12 months after the initial release of the minor version, or until the release of the third subsequent minor version, whichever is longer. We plan to have Anthos minor releases three times a year, around the months of April, August, and December, with a monthly patch release (for example, z in version x.y.z) for supported minor versions. For more information, read here. Anthos Policy Controller enables the enforcement of fully programmable policies for your clusters across environments. We are thrilled to announce the launch of our new built-in Policy Controller dashboard, a powerful tool that makes it easy to manage and monitor the policy guardrails applied to your fleet of clusters. New policy bundles are available to help audit your cluster resources against Kubernetes standards, industry standards, or Google recommended best practices. The easiest way to get started with Anthos Policy Controller is to install Policy Controller and try applying a policy bundle to audit your fleet of clusters against a standard such as the CIS benchmark. Dataproc is an important service in any data lake modernization effort. Many customers begin their journey to the cloud by migrating their Hadoop workloads to Dataproc and continue to modernize their solutions by incorporating the full suite of Google Cloud's data offerings. Check out this guide that demonstrates how you can optimize Dataproc job stability, performance, and cost-effectiveness. Eventarc adds support for new direct events from the following Google services in Preview: API Gateway, Apigee Registry, BeyondCorp, Certificate Manager, Cloud Data Fusion, Cloud Functions, Cloud Memorystore for Memcached, Database Migration, Datastream, Eventarc, and Workflows. This expands the catalog of pre-integrated events offered in Eventarc across Google services and third-party SaaS vendors. The latest mFit release adds support for JBoss and Apache workloads by including fit analysis and framework analytics for these workload types in the assessment report; see the release notes for important bug fixes and enhancements. Google Cloud Deploy now supports a newer Skaffold version (release notes). Cloud Workstations: labels can now be applied to Cloud Workstations resources (release notes). Cloud Build: Cloud Build repositories (2nd gen) lets you easily create and manage repository connections, not only through the Cloud console but also through gcloud and the Cloud Build API (release notes).

Week of Jan: Cloud CDN now supports private origin authentication for Amazon Simple Storage Service (Amazon S3) buckets and compatible object stores in Preview. This capability improves security by allowing only trusted connections to access the content on your private origins and preventing users from directly accessing it.

Week of Jan: Revionics partnered with Google Cloud to build a data-driven pricing platform for speed, scale, and automation with BigQuery, Looker, and more. As part of the Built with BigQuery program, this blog post describes the use cases, problems solved, solution architecture, and key outcomes of hosting Revionics' product, Platform Built for Change, on Google Cloud. A comprehensive guide for designing reliable infrastructure for your workloads in Google Cloud is now available.
The guide combines industry-leading reliability best practices with the knowledge and deep expertise of reliability engineers across Google. Understand the platform-level reliability capabilities of Google Cloud, the building blocks of reliability in Google Cloud, and how these building blocks affect the availability of your cloud resources. Review guidelines for assessing the reliability requirements of your cloud workloads. Compare architectural options for deploying distributed and redundant resources across Google Cloud locations, and learn how to manage traffic and load for distributed deployments. Read the full blog here. GPU Pods on GKE Autopilot are now generally available: customers can now run ML training, inference, video encoding, and all other workloads that need a GPU, with the convenience of GKE Autopilot's fully managed Kubernetes environment. A new Kubernetes minor version is now generally available on GKE, and GKE customers can now take advantage of the many new features in this exciting release. This release continues Google Cloud's goal of making Kubernetes releases available to Google customers within days of the Kubernetes OSS release. Event-driven transfer for Cloud Storage: customers have told us they need an asynchronous, scalable service to replicate data between Cloud Storage buckets for a variety of use cases, including aggregating data in a single bucket for data processing and analysis, and keeping buckets across projects, regions, and continents in sync. Google Cloud now offers Preview support for event-driven transfer: serverless, real-time replication to move data from AWS S3 to Cloud Storage and to copy data between multiple Cloud Storage buckets. Read the full blog here. Pub/Sub Lite now offers export subscriptions to Pub/Sub. This new subscription type writes Lite messages directly to Pub/Sub, with no code development or Dataflow jobs needed, which is great for connecting disparate data pipelines and for migration from Lite to Pub/Sub. See here for documentation. 2023-06-06 16:00:00
