TECH |
Engadget Japanese |
On February 18, 2012, the FUJIFILM X-Pro1, the first mirrorless camera to use the X mount, went on sale: What Day Is It Today? |
https://japanese.engadget.com/today18-203057827.html
|
fujifilmxpro |
2022-02-17 20:30:57 |
AWS |
AWS Security Blog |
Introducing s2n-quic, a new open-source QUIC protocol implementation in Rust |
https://aws.amazon.com/blogs/security/introducing-s2n-quic-open-source-protocol-rust/
|
Introducing s2n-quic, a new open-source QUIC protocol implementation in Rust. At Amazon Web Services (AWS), security, high performance, and strong encryption for everyone are top priorities for all our services. With these priorities in mind, less than a year after QUIC's ratification in the Internet Engineering Task Force (IETF), we are introducing support for the QUIC protocol, which can boost performance for web applications that … |
2022-02-17 20:05:43 |
AWS |
AWS |
Fueling innovation at Formula 1 | Amazon Web Services |
https://www.youtube.com/watch?v=LfgUZSvpzUM
|
Fueling innovation at Formula 1 | Amazon Web Services. Hear from Rob Smedley, Director of Data Systems at Formula 1, as he shares how working with AWS has helped cement F1 as the most technologically driven sport on the planet, using data to fuel new innovations and fan engagement. Learn more about how AWS powers F1 Insights. ABOUT AWS: Amazon Web Services (AWS) is the world's most comprehensive and broadly adopted cloud platform, offering fully featured services from data centers globally. Millions of customers, including the fastest-growing startups, largest enterprises, and leading government agencies, are using AWS to lower costs, become more agile, and innovate faster. #F1 #Formula1 #AWS #AmazonWebServices #CloudComputing |
2022-02-17 20:46:35 |
Google |
Official Google Blog |
Our new $100 million Google Career Certificates Fund |
https://blog.google/outreach-initiatives/grow-with-google/career-certificates-fund/
|
Our new $100 million Google Career Certificates Fund. Editor's note: Today, Google CEO Sundar Pichai announced a new $100 million Google Career Certificates Fund at an event with U.S. Assistant Secretary of Commerce for Economic Development Alejandra Castillo and the CEOs of Social Finance, Merit America, and Year Up. Below is an edited transcript of his remarks.

One of the best parts of my job is visiting the communities where Google operates. These visits remind me that America is full of people who want to work hard and contribute to their communities. That sense of purpose and optimism is what brought me to America years ago, and it's what drew me to Google and its mission to organize the world's information and make it universally accessible and useful. We are a company of technology optimists: we believe in what people can do with technology to improve their lives and the lives of others. That's what inspired us to launch Grow with Google, to help all Americans access training to grow their skills, careers, and businesses. What we've learned over the last five years is what can be accomplished when private-sector companies like ours come together with public-sector institutions and nonprofit partners. Our digital skills program is one example: together we've helped train eight million Americans across every state. Another example is our Google Career Certificates. Seventy thousand Americans have now completed these certificates. They prepare people for high-paying, high-growth jobs in fields like data analytics, IT support, project management, and user experience design, and they are available to anyone, no college degree required. Seventy-five percent of graduates report seeing a positive impact on their career within six months, including a raise or a new job. That includes Natalie Burns, whom I met previously and who is here with us today. Natalie earned her Google IT Support certificate while attending community college in Texas. She got a job in cybersecurity and, I'm told, a significant pay increase. Congrats, Natalie!

We want to help more people access these certificates, especially in underserved communities. That's why I'm excited to announce a new $100 million Google Career Certificates Fund. The goal is to enable Social Finance to reach more American workers, and this investment in America's future has the potential to drive substantial wage gains. This fund is a new kind of financing model. We'll invest Google capital and Google.org grants and provide our Career Certificate program. Social Finance will provide funding to nonprofit partners like Merit America and Year Up, who in turn will provide services like career coaching, living stipends, and job placement support. And we'll connect students to a consortium of employers who are looking to hire workers with these skills. It's all designed around student success: students will receive all of this at no upfront cost, and will only pay it back once they find a job with a sufficient salary. Social Finance will then redistribute those repayments to future learners, making this model more sustainable. It's another promising example of how the entire ecosystem, from private companies to nonprofits, can work together to help more Americans access economic opportunities. I'm excited to see all the ways this could be transformative for people, their families, and their communities. Thank you to our partners again for their efforts and support. |
2022-02-17 20:30:00 |
海外TECH |
MakeUseOf |
Goodbye, CentOS: How to Install Rocky Linux 8 |
https://www.makeuseof.com/goodbye-centos-how-to-install-rocky-linux/
|
linux |
2022-02-17 20:45:15 |
海外TECH |
MakeUseOf |
4 Ways to Organize Your Kindle Library Using Calibre |
https://www.makeuseof.com/how-to-organize-kindle-library-using-calibre/
|
calibre |
2022-02-17 20:30:15 |
海外TECH |
MakeUseOf |
8 Reasons Why Photographers Should Start a YouTube Channel |
https://www.makeuseof.com/why-photographers-should-start-youtube-channel/
|
youtube |
2022-02-17 20:15:13 |
海外TECH |
MakeUseOf |
Your Windows 11 PC Will Soon Run Windowed Games Better |
https://www.makeuseof.com/windows-11-run-windowed-games-better/
|
windows |
2022-02-17 20:08:53 |
海外TECH |
DEV Community |
Stimulus Rails 7 Tutorial |
https://dev.to/bhumi/stimulus-rails-7-tutorial-5a6a
|
Stimulus Rails 7 Tutorial

Hotwire (HTML over the wire) ships by default in Rails 7. Stimulus is one component of Hotwire, the other one being Turbo. The key promise of the Hotwire approach is to get the benefits of single-page JavaScript applications, like faster, more fluid user interfaces, without writing much JavaScript, and certainly not the amount needed with full-fledged client-side JavaScript frameworks. But JavaScript on the page is still needed for modern web app behaviors, like showing/hiding elements, adding an item to a todo list, etc. This is where StimulusJS comes in. Stimulus is advertised as a minimal JavaScript framework. It has been around for a few years; the latest version, Stimulus 3, was released in October. It is small and simple enough that, if you try the examples in this post, you should have a good handle on how to use Stimulus in a matter of minutes. Fun fact: Stimulus uses the browser's MutationObserver API to detect DOM changes.

Introduction

Let's see some code first. Consider the following HTML:

    <div data-controller="clipboard">
      PIN: <input data-clipboard-target="source" type="text" readonly>
      <button data-action="clipboard#copy">Copy to Clipboard</button>
    </div>

Some facts to note about Stimulus from the above code:

- You can get an idea of what's going on by looking at the HTML alone, without looking at the clipboard controller code. This is different from other HTML where an external JS file applies event handlers to it.
- Stimulus does not bother itself with creating the HTML. That's still rendered on the server, either on page load (first hit), or via Turbo, or via an Ajax request that changes the DOM. Stimulus is concerned with manipulating the existing HTML document, by adding a CSS class that hides/animates/highlights an element. Stimulus can create new DOM elements, and that's allowed, but that's the minority case. The focus is on manipulating, not creating, elements.

How Stimulus differs from mainstream JavaScript frameworks: other frameworks are focused on turning JSON into DOM elements via a template language. Other frameworks maintain state within JavaScript objects; for Stimulus, state is stored in the HTML, so that controllers can be discarded between page changes but still reinitialize as they were when the cached HTML appears again.

How Stimulus Works

Stimulus is designed to enhance static or server-rendered HTML by connecting JavaScript objects to elements on the page using simple annotations. These JavaScript objects are called controllers, and Stimulus monitors the page, waiting for HTML data-controller attributes to appear. Each attribute's value is a controller class name: Stimulus finds that class, creates a new instance of it, and connects it to the element. Just like the class attribute is a bridge connecting HTML to CSS, the data-controller attribute is a bridge connecting HTML to JavaScript.

In addition to controllers, the other major Stimulus concepts are:

- actions, which connect controller methods to DOM events using data-action attributes
- targets, which locate elements of significance within a controller
- values, which read/write/observe data attributes on the controller's element

We will see more examples of how controllers, actions, targets, and values are used in the code below. These examples are from the official Stimulus Handbook; you can find the repo here.

Hello World in Stimulus

This example prints a greeting when the user clicks a button, along with the name that was typed into a text box. It demonstrates how actions and targets are used in the code:

    <body>
      <div data-controller="hello">
        <input data-hello-target="name" type="text">
        <button data-action="click->hello#greet">Greet</button>
      </div>
    </body>

The data-controller attribute connects this HTML to a class in the hello_controller.js file. Stimulus also auto-initializes this controller object. The data-action means: when this button is clicked, execute the code inside the greet method of the hello controller. The value click->hello#greet is called an action descriptor. If you try this, you may notice that it works without the click-> part, so just data-action="hello#greet" works too. This is because Stimulus defines default actions for some elements, i.e. click for a button. The data-<controller-name>-target attribute is a way to connect an HTML element to the controller, such that its value can be accessed inside the controller; in this case, data-hello-target. This is what the code looks like inside hello_controller.js:

    import { Controller } from "@hotwired/stimulus"

    export default class extends Controller {
      static targets = ["name"]

      greet() {
        const element = this.nameTarget
        const name = element.value
        console.log(`hello, ${name}`)
      }
    }

We create a property for the target by adding name to our controller's list of target definitions. Stimulus will automatically create a this.nameTarget property, which returns the first matching target element. We can use this property to read the element's value and build our greeting string.

Building a Copy-to-Clipboard Button

You know the little copy button or icon next to some text, there to make it easy to copy to the clipboard? The below code builds that functionality in Stimulus using the browser's Clipboard API. The HTML looks like this:

    <body>
      <!-- Example: Copy To Clipboard -->
      <div data-controller="clipboard">
        PIN: <input data-clipboard-target="source" type="text" readonly>
        <button data-action="clipboard#copy">Copy to Clipboard</button>
      </div>

      <!-- More than one instance of the clipboard controller on the page -->
      <div data-controller="clipboard">
        PIN: <input data-clipboard-target="source" type="text" readonly>
        <button data-action="clipboard#copy">Copy to Clipboard</button>
      </div>

      <!-- Use other HTML elements, like link and textarea, instead of button and input -->
      <div data-controller="clipboard">
        PIN: <textarea data-clipboard-target="source" readonly></textarea>
        <a href="#" data-action="clipboard#copy" class="clipboard-button">Copy to Clipboard</a>
      </div>
    </body>

The clipboard_controller.js looks like this:

    import { Controller } from "@hotwired/stimulus"

    export default class extends Controller {
      static targets = ["source"]

      // v1: with a button, using the browser Clipboard API
      // copy() {
      //   navigator.clipboard.writeText(this.sourceTarget.value)
      // }

      // v2: copy action attached to an <a> link, input from a <textarea>
      copy(event) {
        event.preventDefault()
        this.sourceTarget.select()
        document.execCommand("copy")
      }
    }

Some interesting things to learn from the above example:

- What does the static targets line do? When Stimulus loads our controller class, it looks for a static array with the name targets. For each target name in the array, Stimulus adds three new properties to our controller. For the source target name above, we get these properties: this.sourceTarget, this.sourceTargets, and this.hasSourceTarget.
- We can instantiate the same controller more than once on a page. Stimulus controllers are reusable: any time we want to provide a way to copy a bit of text to the clipboard, all we need is the markup on the page with the right data annotations, and it just works. In the HTML above, we have the exact same div for copying PINs duplicated twice. The 2nd copy has a different value, so we can test that both copy buttons work and copy the right thing. The thing that's implicit here is that we have two different instances of the controller class, and each instance has its own sourceTarget property with the correct value. This is how we keep them separate, copy the corresponding value, and don't get the values mixed up with the other input element annotated with data-clipboard-target="source" on the page: it's because the controller is scoped to its <div>. This implies that if we put two buttons inside the same <div>, things would not work as expected. The below will always copy the value in the first text box:

    <div data-controller="clipboard">
      PIN: <input data-clipboard-target="source" type="text" readonly>
      <button data-action="clipboard#copy">Copy to Clipboard</button>

      PIN: <input data-clipboard-target="source" type="text" value="this won't get copied" readonly>
      <button data-action="clipboard#copy">Copy to Clipboard</button>
    </div>

- Actions and targets can go on any HTML elements. So do we have to use a <button> for the copy-to-clipboard functionality? No, we could use other elements, like a link (an <a> tag), in which case we want to make sure to preventDefault. We can also use a <textarea> instead of the <input type="text">; the controller only expects it to have a value property and a select method.

Designing for Progressive Enhancement

This is about building in support for older browsers, as well as considering what happens to our application when there are network or CDN issues. It may be tempting to write these things off as not important, but often it's trivially easy to build features in a way that's gracefully resilient to these types of problems. This approach, commonly known as progressive enhancement, is the practice of delivering web interfaces where the basic functionality is implemented in HTML and CSS, and tiered upgrades to that base experience are layered on top with CSS and JavaScript, progressively, when supported by the browser.

With the Clipboard API, the idea is to hide the Copy to Clipboard button unless the browser has support for the Clipboard API. We do this by adding classes to the HTML, adding a bit of CSS to hide the button, and adding a feature check in our JavaScript controller to toggle the class and show the button if the browser supports the Clipboard API. The HTML looks like this:

    <div data-controller="clipboard" data-clipboard-supported-class="clipboard--supported">
      PIN: <input data-clipboard-target="source" type="text" readonly>
      <button data-action="clipboard#copy" class="clipboard-button">Copy to Clipboard</button>
    </div>

And we add a connect method to clipboard_controller.js:

    static classes = ["supported"]

    connect() {
      navigator.permissions.query({ name: "clipboard-write" }).then((result) => {
        if (result.state === "granted") {
          this.element.classList.add(this.supportedClass)
        }
      })
    }

An issue I ran into locally on Firefox with clipboard-write: this code runs happily on Chrome and does the progressive enhancement. On Firefox, I get this error in the console: Uncaught (in promise) TypeError: 'clipboard-write' (value of 'name' member of PermissionDescriptor) is not a valid value for enumeration PermissionName. So even the code to check whether a given browser has access to a feature (in this case the Clipboard API) itself has browser-specific issues.

Managing State: Slideshow Controller

Most JavaScript frameworks encourage you to keep state in JavaScript at all times. They treat the DOM as a write-only rendering target, using client-side templates after consuming JSON from the server. Stimulus takes a different approach: a Stimulus application's state lives as attributes in the DOM, and controllers (i.e. the JavaScript parts) are largely stateless. This approach makes it possible to work with HTML from anywhere: the initial document, an Ajax request, a Turbo visit, or even another JavaScript library. We build a slideshow controller that keeps the index of the currently selected slide in an attribute, to learn how to store values as state in Stimulus.

Lifecycle callbacks in Stimulus

Stimulus lifecycle callback methods are useful for setting up or tearing down associated state when our controller enters or leaves the document. These methods are invoked by Stimulus:

- initialize: once, when the controller is first instantiated
- connect: anytime the controller is connected to the DOM
- disconnect: anytime the controller is disconnected from the DOM

Using Values in Stimulus

The concept of values is another core thing in Stimulus, similar to the concepts of controllers, actions, and targets. Stimulus controllers support typed value properties, which automatically map to data attributes (value is a hash, while targets and classes are arrays). When we add a value definition to our controller class like this: static values = { index: Number }, Stimulus creates a this.indexValue controller property, associated with a data-slideshow-index-value attribute, and handles the numeric conversion for us.

Value change callback

In the code below, notice how we have to manually call the this.showCurrentSlide() method each time we change the value in this.indexValue. Actually, Stimulus will automatically do this for us if we add a method with the name indexValueChanged. This method will be called at initialization and in response to any change to the data-slideshow-index-value attribute, including if we make changes to it in the web inspector. Once we add indexValueChanged, we can also remove the initialize method altogether. The HTML code looks like this:

    <div data-controller="slideshow" data-slideshow-index-value="0">
      <button data-action="slideshow#previous">←</button>
      <button data-action="slideshow#next">→</button>

      <div data-slideshow-target="slide">…</div>
      <div data-slideshow-target="slide">…</div>
      <div data-slideshow-target="slide">…</div>
      <div data-slideshow-target="slide">…</div>
    </div>

The slideshow_controller.js looks like this:

    import { Controller } from "@hotwired/stimulus"

    export default class extends Controller {
      static targets = ["slide"]
      static values = { index: Number }

      initialize() {
        this.showCurrentSlide()
      }

      next() {
        this.indexValue++
        this.showCurrentSlide()
      }

      previous() {
        this.indexValue--
        this.showCurrentSlide()
      }

      showCurrentSlide() {
        this.slideTargets.forEach((element, index) => {
          element.hidden = index !== this.indexValue
        })
      }
    }

We can use the web inspector to confirm that the controller element's data-slideshow-index-value attribute changes as we move from one slide to the next, and that the hidden attribute is added and removed from each of the slide elements as we navigate.

Working With External Resources: HTTP Requests and Timers

Sometimes our controllers need to track the state of external resources, where by external we mean anything that isn't in the DOM or a part of Stimulus. This example builds a simple email inbox where the HTML for new messages is loaded asynchronously (in the example, messages.html is just a static file, but normally the server would return this HTML) using fetch, and then plopped into the innerHTML of the controller's div. We then also use a timer to refresh and load new messages at a fixed interval. This timer is started and stopped in the lifecycle methods connect and disconnect, respectively. The HTML placeholder looks like this, annotated with Stimulus attributes:

    <div data-controller="content-loader"
         data-content-loader-url-value="/messages.html"
         data-content-loader-refresh-interval-value="5000"></div>

The content_loader_controller.js looks like this:

    import { Controller } from "@hotwired/stimulus"

    export default class extends Controller {
      static values = { url: String, refreshInterval: Number }

      connect() {
        this.load()

        if (this.hasRefreshIntervalValue) {
          this.startRefreshing()
        }
      }

      disconnect() {
        this.stopRefreshing()
      }

      load() {
        fetch(this.urlValue)
          .then(response => response.text())
          .then(html => this.element.innerHTML = html)
      }

      startRefreshing() {
        this.refreshTimer = setInterval(() => {
          this.load()
        }, this.refreshIntervalValue)
      }

      stopRefreshing() {
        if (this.refreshTimer) {
          clearInterval(this.refreshTimer)
        }
      }
    }

Using the content-loader controller on multiple elements: params

So far we have seen the concepts of controllers, actions, targets, and values. params is another Stimulus feature. params are associated with the element, and not attached at the controller level (unlike values and targets), i.e. there is no static params in the controller. Here is an example:

    <div data-controller="content-loader">
      <a href="#" data-content-loader-url-param="/messages.html" data-action="content-loader#load">Messages</a>
      <a href="#" data-content-loader-url-param="/comments.html" data-action="content-loader#load">Comments</a>
    </div>

That url param can be accessed in the controller's load action with params.url, like this:

    import { Controller } from "@hotwired/stimulus"

    export default class extends Controller {
      load({ params }) {
        fetch(params.url)
          .then(response => response.text())
          .then(html => this.element.innerHTML = html)
      }
    }

What happens if you add the same data-controller to nested HTML elements? I made a goofy mistake of adding data-controller="content-loader" to that 2nd <a> tag above, in addition to it being on the parent <div>, and got to see some wonderfully weird results: the entire index.html loaded over and over again on the page. I could see the calls piling up in the network tab, and the page's scroll bar getting smaller and smaller. Perhaps I can think through this and use it as a way to play around with the internal workings of Stimulus. This specific thing was further convoluted by the fact that the above load method ran in parallel with another load method, from the original example of getting inbox messages loaded on an interval timer.

Summary

In the examples above, we have seen the main concepts of Stimulus: controllers, actions, targets, and values. Stimulus allows us to add behavior to static or server-rendered HTML by connecting JavaScript objects to elements on the page using simple annotations: the data attributes on our HTML elements. For more in-depth posts on all things Rails, Ruby, and Software Development, check out CodeCurious. |
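[Editorial example, not from the original post.] The action descriptor syntax discussed above (event->controller#method, with a per-element default event when the event part is omitted) can be sketched as a tiny parser. This is an illustrative sketch, not Stimulus source code; DEFAULT_EVENTS and parseActionDescriptor are hypothetical names.

```javascript
// Illustrative sketch: parse an action descriptor such as
// "click->hello#greet" into its event, controller, and method parts.
// When the "event->" prefix is omitted, fall back to a per-element
// default, as the post notes Stimulus does for elements like <button>.
const DEFAULT_EVENTS = { button: "click", a: "click", form: "submit", input: "input" };

function parseActionDescriptor(descriptor, tagName) {
  // Split off the optional "event->" prefix.
  const [event, rest] = descriptor.includes("->")
    ? descriptor.split("->")
    : [DEFAULT_EVENTS[tagName.toLowerCase()], descriptor];

  // The remainder is "controller#method".
  const [controller, method] = rest.split("#");
  return { event, controller, method };
}

console.log(parseActionDescriptor("click->hello#greet", "button"));
// → event "click", controller "hello", method "greet"
console.log(parseActionDescriptor("clipboard#copy", "button"));
// → event "click" (button default), controller "clipboard", method "copy"
```

This also makes it concrete why data-action="hello#greet" works on a button: the element's tag name supplies the default event.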
2022-02-17 20:33:06 |
海外TECH |
DEV Community |
Basic Google Big Query Operations with a Salesforce sync demo in MULE 4 |
https://dev.to/yucelmoran/basic-google-big-query-operations-with-a-salesforce-sync-demo-mule-4-52ak
|
Basic Google BigQuery Operations with a Salesforce sync demo in MULE 4

If we think about data storage, the first thing that comes to mind is a regular database. This can be any of the most popular ones, like MySQL, SQL Server, Postgres, Vertica, etc., but I've noticed that not too many people have interacted with one of the services Google provides for the same purpose: Google BigQuery. Maybe it is because of the pricing, but in the end many companies are moving to cloud services, and this service seems to be a great fit for them. In this post I would like to demonstrate, in a few steps, how we can make a sync job that allows us to describe a Salesforce instance and use a few objects to create a full schema of those objects (tables) in a Google BigQuery dataset. Then, with the schema created, we should be able to push some data into BigQuery from Salesforce and see it in our Google Cloud Console project.

In order to connect to Salesforce and Google BigQuery, there are a few prerequisites:

- Salesforce: If you don't have a Salesforce instance, you can create a developer one here. From the Salesforce side you will need a username, password, and security token; you can follow this process to get it. A developer instance contains a few records, but if you need some more data, that will help the process sync more information over to GCP.
- Google Cloud Platform (GCP): You can sign up here for free. Google gives you trial credit for a limited number of days to test the product, similar to Azure. Also, if you already have a Google account, you can use it for this.

CREATING A NEW PROJECT IN GCP AND SETTING UP OUR SERVICE ACCOUNT KEY

Once you sign up for your account on GCP, you should be able to click on the New Project option and write a project name; in this example I chose "mulesoft". Once the project is created, go to the menu on the left and select the IAM & Admin > Service Accounts option. Now we can create our service account. A service account is a special type of Google account intended to represent a non-human user that needs to authenticate and be authorized to access data in Google APIs. Typically, service accounts are used in scenarios such as running workloads on virtual machines. At the top of the page you should see the option to create it; you just need to specify a name and click on Create and Continue. The next step is to set the permissions: for this we need to select BigQuery Admin from the roles combo. Once created, we can select the Manage Keys option from the three-dot menu on the right. Then we can create a new key; in this case, one as JSON should be enough. The key will get downloaded automatically to your computer. Please keep this JSON key somewhere you can use it later.

DATASET IN BIGQUERY

Datasets are top-level containers that are used to organize and control access to your tables and views. A table or view must belong to a dataset, so you need to create at least one dataset before loading data into BigQuery. From the left menu we can search for BigQuery and click on it. That will take us to the BigQuery console; now we can click on the three-dots menu and select the Create dataset option. We just need to set the name as "salesforce" and click on Create Dataset.

SETTING UP OUR MULE APPLICATION

Since this is a sync job, we don't need any API specification, but this totally can fit scenarios where we have another application that needs to consume specific endpoints/operations. Let's then open our Anypoint Studio app (in my case I'm using a Mac) and use the default template. For this we are going to create six flows:

- Sync: this flow just triggers the process.
- DescribeInstance: this flow is in charge of calling the describe operation using the Salesforce connector and providing all object information from the Salesforce instance; it also has a loop that allows us to process the job for the objects we are going to use.
- DescribeIndividualSalesforceObject: allows describing a specific Salesforce object. This basically captures the fields and field types (STRING, EMAIL, ID, REFERENCE, etc.) and is in charge of creating a payload that BigQuery will recognize, so the table can be created in GBQ.
- BigQueryCreateTable: this flow is only in charge of creating the table in BigQuery, based on the Salesforce object name and fields.
- QuerySalesforceObject: this flow dynamically queries the Salesforce object and pulls the data. Here we limit the output to a fixed number of records, but at a bigger scale it should be done in a batch process, of course.
- InsertDataIntoBigQuery: this flow only pushes the data over into BigQuery.

Now let's grab the JSON key generated by Google and copy the file under the src/main/resources folder. The key will let us authenticate against our project and execute the operations.

IMPORT THE GOOGLE BIGQUERY CONNECTOR

From Exchange we can search for "Big Query" and we should see the connector listed. Then we can just use the Add to project option, and the operations appear in the Palette.

SYNC FLOW

As I mentioned, this is only in charge of triggering the whole application, so we only need one Scheduler component and a flow reference to the DescribeInstance flow.

DESCRIBEINSTANCE

This flow describes the whole Salesforce instance using the Describe Global operation. The next step is to use a DataWeave transform to filter down to only the objects we are interested in; in this case I'm only pulling three: Accounts, Contacts, and a custom object called Project__c. I left a few more attributes in the transformation, to only pull the objects that we are able to query. Finally, you need to loop over these three objects, and there's a flow reference in this sample that calls the other flows to continue the process.

DESCRIBEINDIVIDUALSALESFORCEOBJECT

The flow basically takes the name of the Salesforce object and describes it (the connector only asks for the object name). Then we have a pretty interesting DW transformation (reconstructed from the original gist, approximately):

    %dw 2.0
    input payload application/java
    output application/java
    fun validateField(field) =
      if (field == "REFERENCE" or field == "ID" or field == "PICKLIST" or field == "TEXTAREA"
          or field == "ADDRESS" or field == "EMAIL" or field == "PHONE" or field == "URL") "STRING"
      else if (field == "DOUBLE" or field == "CURRENCY") "FLOAT"
      else if (field == "INT") "INTEGER"
      else field
    ---
    payload.fields filter ($.'type' != "LOCATION") map {
      fieldName: $.name,
      fieldType: validateField($.'type')
    }

Salesforce data types are not the same as BigQuery's, so we need a little trick to create the schema in BigQuery seamlessly, matching Salesforce. In this case I've created a small function to convert some fields, like ID, REFERENCE, TEXTAREA, PHONE, ADDRESS, PICKLIST, and EMAIL, to be STRING (these references or values are not really anything other than text); for DOUBLE and CURRENCY I'm using the value FLOAT; and finally, INT fields are changed to INTEGER. Because location fields are a bit tricky and we are not able to do much with them through the API, I'm removing all location fields. The output of this is the actual schema we will use to create the table in Google BigQuery.

BIGQUERYCREATETABLE

This flow allows us to create the table in BigQuery; we only need to specify the Table, Dataset, and Table Fields.

QUERYSALESFORCEOBJECT

This flow basically queries the object in Salesforce and then maps the data dynamically to prepare the payload for BigQuery. The query basically comes from a variable, salesforceFields, the same fields we collected when we described the object, built with a script along these lines (reconstructed approximately):

    (payload.fields filter ($.'type' != "LOCATION") map $.name) joinBy ","

And finally, I'm limiting the result to a fixed number of records. The next step is to map the Salesforce result data dynamically, using this script (reconstructed from the original gist, approximately):

    %dw 2.0
    import try, fail from dw::Runtime
    output application/java
    fun isDate(value: Any): Boolean = try(() -> value as Date).success
    fun getDate(value: Any) =
      if (isDate(value)) value as Date as String else value
    ---
    payload map ((item, index) ->
      item mapObject ((value, key, index) ->
        (key): getDate(value)))

Thanks so much to Alexandra Martinez for the insights on the utilities for DW. This last script basically maps the records, using the key as the field name and keeping the value, except that string values which are dates or date-times need to be converted to Date. So I consider this the best script in this app.

INSERTDATAINTOBIGQUERY

This flow just inserts the data we prepared; basically we only need to specify the Table ID, Dataset ID, and the Row Data.

RUNNING OUR MULE APPLICATION

Now we should be able to run our application and see the new tables and the data over in Google BigQuery. On GCP I can see the tables I selected created, and if we open any of them, we can look into the schema to verify all fields are there. Finally, we can query the table in the console, or click on the Preview option to check that the data is there.

I think this is kind of a common request we get in the integration space, and many tweaks can be implemented if we are thinking of big migrations, or setting up jobs that eventually require tables to be created automatically from Salesforce to GCP. If you'd like to try it, I created this GitHub repository. I hope this was useful, and I'm open to hearing about any enhancement scenarios. |
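[Editorial example, not from the original post.] The type-mapping logic described above is written in DataWeave in the post; as a rough sketch of the same idea in plain JavaScript (function names toBigQueryType and buildSchema are illustrative, not from the post):

```javascript
// Illustrative port of the post's Salesforce-to-BigQuery type mapping:
// text-like Salesforce types become STRING, numeric types become
// FLOAT/INTEGER, and LOCATION fields are dropped entirely.
const TEXT_LIKE = new Set([
  "REFERENCE", "ID", "PICKLIST", "TEXTAREA", "ADDRESS", "EMAIL", "PHONE", "URL",
]);

function toBigQueryType(sfType) {
  if (TEXT_LIKE.has(sfType)) return "STRING";
  if (sfType === "DOUBLE" || sfType === "CURRENCY") return "FLOAT";
  if (sfType === "INT") return "INTEGER";
  return sfType; // pass through types that already match (e.g. STRING, BOOLEAN)
}

function buildSchema(fields) {
  return fields
    .filter(f => f.type !== "LOCATION") // location fields are removed, as in the post
    .map(f => ({ fieldName: f.name, fieldType: toBigQueryType(f.type) }));
}

console.log(buildSchema([
  { name: "Id", type: "ID" },
  { name: "AnnualRevenue", type: "CURRENCY" },
  { name: "BillingCoordinates", type: "LOCATION" },
]));
// keeps Id → STRING and AnnualRevenue → FLOAT; drops the LOCATION field
```

The design point is the same as in the DataWeave version: normalize the type vocabulary first, so table creation in BigQuery is a mechanical step.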
2022-02-17 20:21:56 |
海外TECH |
DEV Community |
Nudge Github PR reviewers with Slack API |
https://dev.to/strdr4605/nudge-github-pr-reviewers-with-slack-api-djb
|
Nudge Github PR reviewers with Slack API

You know that moment when you have a Pull Request waiting for someone to review, but days pass and no one is looking at it, so you have to ask reviewers manually, in a public channel or privately? How can we automate this?

Previously I was using Bitbucket, and Bitbucket Cloud has a cool feature to nudge reviewers from Slack: it sends a private message notifying reviewers to take a look at the Pull Request. Now I am using Github with Github for Slack, and it has a feature to remind reviewers about PRs at a specific time of the day, but I could not find a nudge-reviewer feature. I searched the Slack app store, but still had no success. So I decided to build an MVP for my project and team. It is not publicly available, but you can create the same feature for your project, team, or company.

Slack App
To publish messages to a Slack workspace you need to create a Slack App:
- Click the Create New App button.
- Pick "From scratch".
- Set App Name to "<Company name> PR nudge" or any other suitable name.
- Pick your company workspace.
- Press Create App.

Next we need to create a webhook for a Slack channel. In the "Add features and functionality" section select Incoming Webhooks, click "Add New Webhook to Workspace", and select the channel where all messages from Github for Slack go (usually this channel is named #dev). Now we have a webhook that will post messages to our channel; next, we need to call this webhook from somewhere.

Userscript
Let's build a userscript that adds a "Nudge reviewers" button to the Github PR page; when clicked, it will nudge, in Slack, the reviewers whose review is still awaited. If you don't know about userscripts, you can check my explanation video, "You Don't Know user.js". First you need a userscript manager extension, so let's install Tampermonkey. After installation, open Tampermonkey and create a new script.

Userscript Header
We start with the metadata block (reconstructed here in its usual shape; adjust the placeholders for your org):

```javascript
// ==UserScript==
// @name        <Your Company> PR nudge
// @namespace   <Your Company>
// @description Nudge PR reviewers
// @author      You
// @match       https://github.com/<Your Company>/*/pull/*
// @icon        https://www.google.com/s2/favicons?domain=github.com
// @grant       GM.xmlhttpRequest
// @connect     hooks.slack.com
// ==/UserScript==

(function () {
  'use strict';
  // Your code here...
})();
```

We need @match .../pull/* to run the script only when navigating to a PR page, @grant GM.xmlhttpRequest to make requests to the Slack webhook, and @connect to allow requests to Slack.

"Nudge reviewers" button
Now let's create a "Nudge reviewers" button and add it to the Reviewers section:

```javascript
(function () {
  'use strict';

  const PR_NUDGE_WEBHOOK = '...'; // your Incoming Webhook URL

  const nudgeButton = document.createElement('button');
  nudgeButton.innerText = 'Nudge reviewers';
  nudgeButton.className =
    'btn btn-sm js-details-target d-inline-block float-left float-none m-0 mr-md-0 js-title-edit-button';

  const reviewersFormElement =
    document.getElementById('reviewers-select-menu').parentElement;
})();
```

Who needs to review
This function returns a list of the Github nicknames whose review is still awaited on this PR:

```javascript
function getAllAwaiting() {
  const allReviewers = Array.from(reviewersFormElement.querySelectorAll('p'));
  const awaiting = allReviewers.filter((reviewerEl) =>
    reviewerEl
      .querySelector('span[aria-label]')
      .getAttribute('aria-label')
      .startsWith('Awaiting')
  );
  const awaitingNicknames = awaiting.map(
    (reviewerEl) => reviewerEl.querySelectorAll('a')[1].innerText
  );
  return awaitingNicknames;
}
```

Mapping a Github user to a Slack user
Here is the sad part of this script: we need to map Github nicknames to Slack member IDs to know whom to nudge on Slack. At this point this is a manual process, and it is the only downside of this solution:

```javascript
const GithubSlackMap = {
  strdr4605: 'U...', // Slack member ID
};
```

To find the member ID, go to a Slack user profile, click More, and click Copy member ID. If you have many engineers in your team it may take a while; it did for me. If you have an idea how to automate this, let me know!

onNudgeBtnClick
Now let's gather all the data and send a request to the Slack webhook. Slack has a specific format style for messages, so to tag people we need a slackMention function:

```javascript
function slackMention(githugNickname) {
  return GithubSlackMap[githugNickname]
    ? `<@${GithubSlackMap[githugNickname]}>`
    : githugNickname;
}

function onNudgeBtnClick(e) {
  e.preventDefault();

  const prLink = window.location.href;
  const prTitle = document.title.split(' by ')[0];
  const awaitingReviewers = getAllAwaiting();

  if (!awaitingReviewers.length) return;

  const dataObj = {
    text: `Hey ${awaitingReviewers.map(slackMention).join(', ')}!\nI really need your review on <${prLink}|${prTitle}>`,
  };

  GM.xmlhttpRequest({
    fetch: true,
    method: 'POST',
    url: PR_NUDGE_WEBHOOK,
    data: JSON.stringify(dataObj),
    responseType: 'json',
    nocache: true,
    onload: (response) => console.log(response),
    onerror: (err) => console.error(err),
  });
}
```

Add the button to the page

```javascript
nudgeButton.addEventListener('click', onNudgeBtnClick);
setTimeout(() => reviewersFormElement.appendChild(nudgeButton));
```

Final result
The complete userscript is the metadata header plus all of the snippets above combined into one IIFE. Now, when clicking "Nudge reviewers", your teammates will be notified and will review your changes faster.

Final step
Don't expose the webhook URL to the public. Add the script to a private repo (company-userscripts) in your Github org, name it pr-nudge.user.js, tell teammates to install Tampermonkey, and send them to add the script on their machines. |
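The message-building part of the userscript can be exercised outside the browser too. This is a sketch: the webhook URL, member ID, and nicknames below are placeholders, and `buildNudgeText` is a hypothetical helper that mirrors the mention-formatting idea above.

```javascript
// Map Github nicknames to Slack member IDs (placeholder values).
const GithubSlackMap = { octocat: "U0000000000" };

// Tag mapped users with Slack's <@MEMBER_ID> syntax and fall back to the
// bare nickname when no mapping exists, so the message still reads sensibly.
function slackMention(nickname) {
  return GithubSlackMap[nickname] ? `<@${GithubSlackMap[nickname]}>` : nickname;
}

// Build the webhook payload text, linking the PR as <url|title>.
function buildNudgeText(reviewers, prLink, prTitle) {
  return `Hey ${reviewers.map(slackMention).join(", ")}!\nI really need your review on <${prLink}|${prTitle}>`;
}

// Posting to an Incoming Webhook is then a plain JSON POST, e.g.:
// fetch(WEBHOOK_URL, { method: "POST", body: JSON.stringify({ text }) })
console.log(
  buildNudgeText(
    ["octocat", "stranger"],
    "https://github.com/org/repo/pull/1",
    "My PR"
  )
);
```

Keeping the text-building pure (no DOM, no network) makes this piece easy to unit-test before wiring it into GM.xmlhttpRequest.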
2022-02-17 20:21:44 |
海外TECH |
DEV Community |
AI plug-ins for WordPress sites |
https://dev.to/hatchith/ai-plug-ins-for-wordpress-sites-9b2
|
AI plug-ins for WordPress sites

If we are looking to incorporate AI into our website's search functionality to optimize the buyer experience, should we do that with a WordPress plugin, or are we better off with a different approach? Any other critical, constructive comments are welcome as well. Thx! |
2022-02-17 20:21:42 |
海外TECH |
DEV Community |
Git: hide specific branch when doing git log --all |
https://dev.to/strdr4605/git-hide-specific-branch-when-doing-git-log-all-226i
|
Git: hide specific branch when doing git log --all

I deploy on Github Pages; deployment and hosting are on the gh-pages branch. When doing my usual check of branch status and history using:

```
git log --oneline --all --graph
```

I faced this problem: I see a very long list of commits on the gh-pages branch, and I need to scroll down to see my other branches.

Solution
You can hide a specific branch using the --exclude option for git log. Now my command looks like this:

```
git log --oneline --graph --exclude=refs/remotes/origin/gh-pages --all
```

Note that --exclude must come before --all to take effect. Now it looks better! You can add this command as a git alias if needed; check the docs for more things to exclude from git log. You can read why I prefer typing long commands instead of git aliases in "How I use Git". |
2022-02-17 20:21:23 |
海外TECH |
DEV Community |
Authentication Spec (Fundamentals of TDD) |
https://dev.to/joesembler/authentication-spec-fundamentals-of-tdd-al1
|
Authentication Spec (Fundamentals of TDD)

Test Driven Development
Test Driven Development, or TDD, is a development process where the developer first creates test cases for the software (meaning, what the developer wants the software to do) and then develops the software by repeatedly testing it against those predetermined test cases. In most cases developers do not use TDD; most create the software first and then test it.

Inspiration
TDD can be a very beneficial way of developing, because the developer must first state very explicitly what he or she wants the computer to do, and can then simply follow the error messages to develop the software. In a way, the developer first tells the computer what to do, and then the computer tells the developer exactly the steps to develop that software. It can definitely be a nice guide if you do not know exactly which steps you must complete in order to build something.

Authentication Spec
For my Phase project at Flatiron, I have to build a webpage with a React frontend and a Rails backend; the page must feature user authentication using cookies and sessions. Below, I created an authentication spec in order to develop the basic login of the webpage.

Capybara
First, I installed the capybara gem and added it to my Gemfile. Next, you want to add require "capybara/rails" below require "rspec/rails" in your rails_helper.rb file. Then you have to add require "capybara/rspec" in your spec_helper.rb file.

authentication_spec.rb
The next step is to create a file in spec/features called authentication_spec.rb. Let's take a look inside this file:

```ruby
require "rails_helper"

describe "the signin process", type: :feature do
  let!(:user) do
    User.create(username: "username", password: "password")
  end

  it "signs me in" do
    visit "/"
    within("#log_in_form") do
      fill_in "username", with: user.username
      fill_in "Password", with: user.password
    end
    click_button "Sign in"
    expect(page).to have_content "Welcome"
  end
end
```

Woah! This is a lot to take in, so let's break it down.

describe "the signin process", type: :feature do
In this line we are describing what we want to test. In this case we are trying to make authentication work, so we are going to test the sign-in process of our application.

let!(:user) do ... end
Here we use the bang version of let in order to create a user instance.

The it "signs me in" block tells the computer explicitly what we want the test to do. The test is going to visit our root directory, and within that page it is going to look for an HTML element with an ID of log_in_form. Within this form it will fill in the part of the form called "username" with the username of the user instance we created above, along with the password. Finally, it is going to click_button "Sign in".

The last part of our code is very important: it is what the computer should expect to see in order to know that the test has passed. In our case, we should see some kind of greeting when we are logged in. We can express this to the computer by saying expect(page).to have_content "Welcome".

Putting it all together
Ok, so now we have our test in place; let's run it. Go to the console and run:

```
bundle exec rspec spec/features/authentication_spec.rb
```

Your test should be failing. Not only should it be failing, but you should also get an error message saying what specifically failed. Now you want to repeatedly run this test, debugging each and every error message until the test passes. Then you will know that your software is doing exactly as you want it to, and you never even had to leave VSCode. |
2022-02-17 20:16:07 |
海外TECH |
DEV Community |
AzureFunBytes Episode 70 - Intro to @Azure Stream Analytics with @fleid_bi |
https://dev.to/azure/azurefunbytes-episode-70-intro-to-azure-stream-analytics-with-fleidbi-28j9
|
AzureFunBytes Episode 70 - Intro to Azure Stream Analytics with @fleid_bi

AzureFunBytes is a weekly opportunity to learn more about the fundamentals and foundations that make up Azure. It's a chance for me to understand more about what people across the Azure organization do and how they do it. Every week we get together on Microsoft LearnTV and learn more about Azure.

Personal note
This will be my final live show of AzureFunBytes. I've really enjoyed doing this show for you every week, and I appreciate you being part of my journey into the services, products, and people that make up an amazing Azure experience. Time for some new challenges!

This week on AzureFunBytes we'll be discussing how best to get started in stream processing with Azure Stream Analytics. Azure Stream Analytics is a real-time analytics service that lets you define streaming jobs in SQL. Once deployed, these jobs subscribe to streaming inputs like Event Hub and filter, enrich, transform, join, or aggregate events as they come.

Stream Analytics offers multiple experiences for developers. Small work such as demos and prototypes can be quickly addressed in the Azure Portal, but most users will find that VS Code delivers the most complete developer experience for Stream Analytics via the ASA Tools extension. To help get started understanding ASA jobs, I've reached out to Microsoft Senior Product Manager Florian Eiden.

Show outline:
- Opening: a special announcement on the future of AzureFunBytes
- Welcome to the show, Florian! So how'd you get here?
- Benefits of being an MVP
- Intro to Azure Stream Analytics
- Why near real time? Stream processing
- ASA: canonical real-time data pipeline
- ASA: value proposition and positioning in Azure
- ASA: developer experience
- Demo time! Let's build something

Our planned agenda:
- High-level intro to stream processing: what Stream Analytics does
- The Azure Stream Analytics service: when to use it vs. Functions or Data Factory
- The local development experience in VS Code: writing a query, debugging, and testing

Local development of Azure Stream Analytics jobs
There are many benefits to developing Stream Analytics jobs locally in VS Code. It's easier to get started: since we support fully local job runs with sample (mock) input files, there is no need to set up an input event hub and find a way to produce events. In this case there is no need to have an Azure subscription, and development costs are non-existent. And if a streaming source is available, we also support local runs on live inputs.

The developer feedback loop is much shorter with local development: running a query takes only a few seconds, and debugging a query is accelerated. Stream processing is not an easy domain to grasp; having quick feedback on what works or not in a query is a must. In addition, we offer unit testing via our npm CI/CD package, the best way to make sure a query will behave properly once deployed. Not to mention the obvious: local development unlocks source control and greatly facilitates CI/CD. So join in, and let's learn about Azure Stream Analytics jobs together!

About Florian Eiden
Florian is a Senior Product Manager on the Azure Stream Analytics team. He's responsible for the SQL language used in Stream Analytics. He spent much of his career moving data in batches and is now trying to speed things up. Born in France, now enjoying the Canadian Pacific Northwest with his family.

Learn about Azure fundamentals with me! The live stream is normally found on Twitch, YouTube, and LearnTV on Thursdays. You can also find the recordings here as well:
- AzureFunBytes on Twitch
- AzureFunBytes on YouTube
- Azure DevOps YouTube Channel
- Follow AzureFunBytes on Twitter

Useful docs:
- Get free Azure Credit
- Microsoft Learn: Introduction to Azure fundamentals
- Introduction to Azure Stream Analytics - Microsoft Docs
- Quickstart: Create an Azure Stream Analytics job in Visual Studio Code - Microsoft Docs
- Develop Azure Stream Analytics queries locally with Visual Studio Code - Microsoft Docs
- Test an Azure Stream Analytics job locally with sample data using Visual Studio Code - Microsoft Docs
- Input validation for better Azure Stream Analytics job resiliency - Microsoft Docs
- Understand time handling in Azure Stream Analytics - Microsoft Docs
- Stream Analytics Query Language Reference - Microsoft Docs |
2022-02-17 20:14:53 |
海外TECH |
DEV Community |
N-Cus-Weather with Azure |
https://dev.to/varanasiroshan2001/n-cus-weather-with-azure-3e3b
|
N-Cus-Weather with Azure

Overview of My Project
Development of a fully fledged hardware-to-software weather station app, integrated with Azure.

Problem: with most existing weather services, people are not able to see and take accurate measurements for a custom area.

Solution using Azure: I developed a hardware integration where we installed a weather station physically on my house, with all types of sensors attached (a UV sensor, a lux sensor, a rain sensor, plus temperature, pressure, humidity, and air-pollution sensors, and many more). I integrated all those hardware devices into one single unit with an Arduino and, with the help of IoT, wrote the website code in JavaScript and CSS using NodeJS and React. I used Azure App Service to host it, and Azure DevOps for continuous monitoring through CI/CD pipelines. I also used Cosmos DB to store all the weather data in key-value format, generate reports, and so on.

The landing page gives this scene. The temperature record here is recorded exactly from the sensors, and the data kept in Cosmos DB is shown below. The "this week" tab shows us the week's data, which is calculated from the last weeks' data, with predictions from our algorithm showing the approximate temperatures for next week. The "today's highlights" part shows us the details of the present day, with the data and parameters that come from all the hardware sensors we have connected and integrated with the Arduino. When we click each tile, it pops up and gives the full detail of the data, with what it is and its latest records from Azure Cosmos DB. Here, with the timestamp, the records are shown; you can see that the latest data was taken today, and the average of the data is shown at the side. The Azure App Service where we deployed this whole infrastructure, with our website code, looks something like this.

Then all of this is kept in continuous development to innovate more, so we keep everything in sync with GitHub triggers and DevOps CI/CD pipelines, so that new code, when committed, launches directly to production after our review; so more automation is there. The physical hardware setup of the sensors and weather station is built and fully working. The Arduino code setup file, with all the integrated sensors, is also attached in the GitHub repo.

Submission Category: Computing Captains
Azure Services Involved: Azure App Service, Azure Static Web Apps, Azure DevOps, Azure Cosmos DB
Link to Code on GitHub: Weather Github
Additional Resources/Info: Complete Video Demo of the project: YouTube Demo |
2022-02-17 20:10:41 |
海外TECH |
DEV Community |
What is wrong with SOURCE MAPS and how not to mess with them? |
https://dev.to/ninjin/what-is-wrong-with-source-maps-and-how-not-to-mess-with-them-5902
|
What is wrong with SOURCE MAPS and how not to mess with them Hello my name is Dmitry Karlovskiy and I have post traumatic stress disorder after generating sourcemaps And today with your help we will treat this by immersing ourselves as deeply as possible in traumatic events This is a text transcript of the speech at HolyJS You can watch video record read as article or open in presentation interface How did I get to this point First the medical history tree formatLanguage view treeFramework molI once developed a simple Tree format to represent abstract syntax trees in the most visual form Based on this format I have already implemented several languages One of them the view tree language is intended for declarative description of components and their composition with each other And it is in this language that all the standard visual components of the mol framework are described This allows you to write short and descriptive code that does a lot of useful things Why DSL Boilerplate Now you see the completed application on mol my app my page title Are you ready for SPAM body lt Agree my checkbox checked val lt gt agree val trueIt consists of a panel with a checkbox inside And together they are connected by two way communication according to the given properties These lines of code even have localization support The equivalent JavaScript code takes up times more space class my app extends my page title return this my text my app title body return this Agree Agree const obj new this my checkbox obj checked val gt this agree val return obj agree val true return value my mem my app prototype agree my mem my app prototype Agree This code although in a more familiar language is much more difficult to understand In addition he completely lost the hierarchy in order to achieve the same level of flexibility The good thing about a flat class is that you can inherit from it and override any aspect of the component s behavior Thus one of the main reasons for using DSL is the 
ability to write simple and concise code that is easy to learn hard to mess up and easy to maintain Why DSL Custom Scripts Another reason for implementing DSLs is the need to let the users themselves extend your application logic using scripts For example let s take a simple task list automation script written by a normal user assignee me component frontend estimate D deadline prev deadline estimateHere he says put me in charge of all tasks indicate that they are all related to the frontend if the estimate is not set then write day and build their deadlines one by one taking into account the resulting estimate JS in a sandbox Is it legal And here you may ask why not just give the user JS in their hands And then I suddenly agree with you I even have a sandbox for safely executing custom JS And the online sandbox for the sandbox sandbox js hyoo ruYou can try to get out of it My favorite example Function is not a function in very spirit of JS JS in a sandbox No it s not for average minds However for the average user JS is too complicated It would be much easier for him to learn some simple language focused on his business area rather than a general purpose language like JS Why DSL Different Targets Another reason to create your own DSL is the ability to write code once and execute it in a variety of runtimes JSWASMGPUJVMCIL Why different targets One model to rule them all As an illustration I will give an example from one startup that I developed For half a year of development we have done quite a lot And all thanks to the fact that we had a universal isomorphic API which was configured by a simple DSL which described what entities we have what attributes they have what types they have how they are related to other entities what indexes they have and all that Just a few dozen entities and under a hundred connections A simple example is the task model task title String estimate DurationFrom this declarative description which occupies several kilobytes code is already 
generated that works both on the server and on the client and of course the database schema is also automatically updated class Task extends Model title return this json title estimate return new Duration this json estimate my mem Task prototype estimate CREATE CLASS Task extends Model CREATE PROPERTY title string CREATE PROPERTY estimate string Thus development and especially refactoring is significantly accelerated It is enough to change the line in the config and after a few seconds we can already pull the new entity on the client Why DSL Fatal flaw And of course what kind of programmer does not like fast cycling Why all this Transpilation and checks So we have a lot of different useful tools Babel and other transpilers Uglify and other minifiers TypeScript AssemblyScript and other programming languages TypeScript FlowJS Hegel and other typecheckers SCSS Less Stylus PostCSS and other CSS generators SVGO CSSO and other optimizers JSX Pug Handlebars and other templaters MD TeX and other markup languages ESLint and other linters Pretier and other formatters Developing them is not an easy task Yes even to write a plugin for any of them you have to take a steam bath So let s think about how all this could be simplified But first let s look at the problems that lie in wait for us on the way So what s the problem This is not what I wrote Let s say a user has written such a simple markdown template Hello World And we generated a spreading code that collects the DOM through JS function make dom parent const child document createTextNode Hello parent appendChild child constchild document createElement strong void parent gt const child document createTextNode World parent appendChild child child parent appendChild child const child document createTextNode parent appendChild child If the user encounters it for example when debugging then it will take him a long time to understand what kind of noodle code is and what he does in general So what s the problem Yes the devil 
will break his leg It is quite sad when the code is not just bloated but also minified with single letter variable and function names Hello World function make dom e const t document createTextNode Hello e appendChild t const t document createElement strong e gt const t document createTextNode World e appendChild t t e appendChild t const t document createTextNode e appendChild t How can sourcemaps help Sources and Debugging But this is where sourcemaps come to the rescue They allow instead of the generated code to show the programmer the code that he wrote Moreover debugging tools will work with sourcemaps it will be possible to execute it step by step set breakpoints inside the line and so on Almost native How can sourcemaps help Stack traces In addition sourcemaps are used to display stack traces The browser first shows links to the generated code downloading sourcemaps in the background after which it replaces links to the source code on the fly How can sourcemaps help Variable values The third hypostasis of sourcemaps is the display of the values of variables In the source example the name next is used but there is no such variable in runtime because in the generated code the variable is called pipe However when hovering over next the browser does a reverse mapping and displays the value of the pipe variable Specification No have not heard It is intuitively expected that sourcemaps should have a detailed specification that can be implemented and that s it we re in chocolate This thing is already years old However things are not so rosy V Internal Closure Inspector formatProposal V JSON Proposal V Speca has versions I did not find the first one and the rest are just notes in Google Docs The whole history of sourcemaps is the story of how a programmer making developer tools heroically fought to reduce their size In total they decreased as a result by about This is not only a rather ridiculous figure in itself but also the struggle for the size of sourcemaps is a 
rather pointless exercise because they are downloaded only on the developer s machine and then only when he is debugging That is we get the classic misfortune of many programmers optimizing not what is important but what is interesting or easier to optimize Never do that How to sort out the sorsmaps If you decide to contact the sourcemaps then the following articles may be useful to you Introduction to JavaScript Source MapsSource Maps fast and clearNext I will tell you about the underwater rake which is abundantly scattered here and there in the name of reducing the size How are sourcesmaps connected Sourcemaps can be connected in two ways It can be done via HTTP header SourceMap lt url gt But this is a rather stupid option as it requires special configuration of the web server Not every static hosting allows this at all It is preferable to use another way placing a link at the end of the generated code sourceMappingURL lt url js map gt sourceMappingURL lt url css map gt As you can see we have a separate syntax for JS and a separate syntax for CSS At the same time the second option is syntactically correct for JS but no it won t work that way Because of this we cannot get by with one universal function for generating code with sourcemaps We definitely need a separate function for generating JS code and a separate one for CSS Here is such a complication out of the blue How do sourcemaps work Let s see what they have inside version sources url url sourcesContent src src names va var mappings AAAA ACCO AAAA ADJH WFCIG ADJI The sources field contains links to sources There can be any strings but usually these are relative links according to which the browser will download the sources But I recommend that you always put these sources in sourcesContent this will save you from the problems that at some point you will have one version of the mappings and the other sources or not download at all And then happy debugging Yes sourcemaps bloat in size but this is a much more 
reliable solution which is important when debugging already buggy code We get that all that struggle for the size of sourcemaps was meaningless since a good half of the sourcemap is source codes The names field stores the runtime variable names This crutch is no longer needed since now browsers can do both forward and reverse mapping That is they themselves pull out the names of the variables from the generated code Well in the mappings field there are already in fact mappings for the generated code How to decode mappings Let s imagine mappings for clarity in several lines in order to understand their structure AAAA ACCO AAAA ADJH WFCIG ADJI For each line of the generated file several spans are specified separated by commas And at the end a semicolon to separate lines Here we have semicolons so there are at least lines in the generated file It is important to emphasize that although a semicolon can be trailing commas cannot be trailing Well more precisely FF eats them and will not choke but Chrome will simply ignore such sourcemaps without any error message What kind of spans are these Span is a set of numbers in the amount of or pieces Span points to a specific place in a specific source The fifth number is the number of the variable name in the names list which as we have already found out is not needed so we simply do not specify this number So what s in these numbers The remaining numbers are the column number in the corresponding line of the generated file the source number the source line number and the column number in this line Keep in mind that numbers start from The last three numbers can be omitted then we will only have a pointer to a column in the generated file which is not mapped anywhere in the source A little later I will tell you why this is necessary In the meantime let s figure out how numbers are encoded And it s all in bytes Differential coding It would be naive to serialize spans like this each row is one span TCSISRSCthirtytwentyBut in 
sourcemaps differential encoding is used That is the field values are presented as is only for the first span For the rest it is not the absolute value that is stored but the relative value the difference between the current and previous span TCSISRSCthirty twenty Please note that if you add to from the first span you get for the second span and if you add more then for the third span The same amount of information is stored in this representation but the dimension of the numbers is somewhat reduced they become closer to And it s all in bytes VLQ encoding Next VLQ encoding or variable length encoding is applied The closer a number is to the fewer bytes it needs to represent values Bit CountBytes Count one As you can see every significant bits of information require additional byte This is not the most efficient way to encode For example WebAssembly uses LEB where a byte is already spent for every significant bits But this is a binary format And here we have mappings for some reason made in JSON format which is text In general the format was overcomplicated but the size was not really won Well okay it s still flowers How good are the sourcemaps If there was a source Sourcemaps map not a range of bytes in one file to a range in another as a naive programmer might think They only map dots And everything that falls between the mapped point and the next one in one file it seems to be mapped to everything after the corresponding point to the next one in another file And this of course leads to various problems For example if we add some content that is not in the source code and accordingly we don t map it anywhere then it will simply stick to the previous pointer In the example we have added Bar And if we do not prescribe any mapping for it and there is nowhere to map it then it will stick to Foo It turns out that Foo is mapped to FooBar and for example displaying the values of variables on hover stops working To prevent this from happening you need to map Bar to 
To do this, you just need the single-number variant of the span. In this case it will be the number 3, since "Bar" starts from the third column. Thus we say that from the given pointer until the next one (or the end of the line) the content is not mapped anywhere, and "Foo" is mapped only onto "Foo".

How good are the sourcemaps? If only there was a result

There is also the opposite situation, when there is content in the source that does not make it into the result. And here, too, there can be a problem with sticking, so you need to map the cut-out content somewhere. But where? The only place is somewhere at the end of the resulting file. This is quite a workable solution, and everything would be fine, but if our pipeline does not end there and processing continues, there may be problems. For example, if we next glue several generated files together, we need to merge their mappings. They are arranged so that they can simply be concatenated; however, the end of one file then becomes the beginning of the next, and everything falls apart.

And what if you need to glue the sourcemaps?

It would be possible to do tricky remapping when concatenating, but here another sourcemap format comes to our aid (there are actually two of them). Composite sourcemaps look like this (offsets illustrative):

    {
        "version": 3,
        "sections": [
            { "offset": { "line": 0, "column": 0 }, "url": "url_for_part" },
            { "offset": { "line": 100, "column": 0 }, "map": { } }
        ]
    }

Here the generated file is divided into sections. For each section, the starting position is given, plus either a link to a regular sourcemap or the content of the sourcemap for that section inline. Note that the beginning of a section is specified in line/column form, which is extremely inconvenient: to measure a section you have to recount all the newlines in the previous sections. Such jokes would look especially fun when generating binary files; fortunately, sourcemaps by design do not support them.

What about macros? Map onto their insides

Another extreme case is macros in one form or another, that is, code generation at the application level. Take, for example, a log macro which takes an expression and wraps it in a conditional logging template:

    template:
        log!(value) => if( logLevel > Info ) { console.log( value ) }

    application:
        log!(stat1)
        log!(stat2)
        log!(stat3)

Thus we do not evaluate a potentially heavy expression when logging is turned off, but we also do not write a pile of identical code. Attention, the question: where do we map the code generated by the macro?

    if( logLevel > Info ) { console.log( stat1 ) }
    if( logLevel > Info ) { console.log( stat2 ) }
    if( logLevel > Info ) { console.log( stat3 ) }

If we map it onto the contents of the macro, then when stepping through the code we will be walking inside the macro body over and over, never stopping at the point of its application. That is, the developer will not be able to see where the macro was entered from and what was passed to it.

What about macros? Let's look at their use sites

Then maybe it is better to map all the generated code to the place where the macro is applied? But here we get a different problem: the debugger stops at the same line, then again, and again. This can go on tediously long, depending on how many instructions are inside the macro. In short, the debugger stops several times at the same place without ever entering the macro code. This is inconvenient, and debugging the macros themselves this way is simply unrealistic.

What about macros? Map both to the use site and to the insides

With macros it is better to combine both approaches: first emit an instruction that does nothing useful but maps to the place where the macro is applied, while the code generated by the macro maps to the macro body:

    void 0                                          // maps to log!(stat1)
    if( logLevel > Info ) { console.log( stat1 ) }  // maps to the macro body
    void 0                                          // maps to log!(stat2)
    if( logLevel > Info ) { console.log( stat2 ) }  // maps to the macro body
    void 0                                          // maps to log!(stat3)
    if( logLevel > Info ) { console.log( stat3 ) }  // maps to the macro body
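The combined strategy can be sketched as a toy source-to-source expander. The function name and the mapping bookkeeping here are invented for illustration; this is not any real macro system:

```javascript
// Expand log!(expr) lines into guarded logging, recording for every output
// line what it should be mapped to: the no-op maps to the call site, the
// guarded statement maps to the macro template.
function expandLogMacro(source) {
  const output = [];  // generated lines
  const mapping = []; // { generated: output line index, mapsTo: string }
  source.split("\n").forEach((line, callSiteLine) => {
    const match = line.match(/^log!\((.+)\)$/);
    if (!match) {
      // Not a macro call: pass through, mapped to its own source line.
      mapping.push({ generated: output.length, mapsTo: `line ${callSiteLine}` });
      output.push(line);
      return;
    }
    // 1. A useless instruction mapped to the application site...
    mapping.push({ generated: output.length, mapsTo: `line ${callSiteLine}` });
    output.push("void 0;");
    // 2. ...and the real expansion mapped to the macro template.
    mapping.push({ generated: output.length, mapsTo: "macro body" });
    output.push(`if (logLevel > Info) { console.log(${match[1]}); }`);
  });
  return { code: output.join("\n"), mapping };
}

const { code, mapping } = expandLogMacro("log!(stat1)\nlog!(stat2)");
console.log(code);
```

Stepping through the output, a debugger driven by this mapping would pause once at each call site (the `void 0;`) and then inside the macro body, which is exactly the behavior described above.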
Thus, when stepping through the code, we first stop at the place where the macro is applied, then step into it and walk through its code, then step out and move on. Almost like with native functions, only without the ability to step over them, because the runtime knows nothing about our macros. It would be nice to add macro support in a future version of sourcemaps. Oh, dreams, dreams.

How good are the sourcemaps? If it weren't for the variable names

With variables everything is also pretty dull. If you think you can select an arbitrary expression in the source and expect the browser to look up what it maps to and evaluate it, then no such luck. Only variable names, no expressions. Any match is just a complete coincidence.

How good are the sourcemaps? If not for the eval

And one more devil in the implementation details. If you generate code not on the server but on the client, then to execute it you need some form of invoking the interpreter. If you use eval for this, the mappings will be fine, but execution will be slow. It is much faster to make a function once and execute it many times:

    new Function( 'debugger' )

But under the hood the browser does something like:

    eval( 'function anonymous(\n) {\ndebugger\n}' )

That is, it prepends two lines to your code, which is why all the mappings point the wrong way. To overcome this you need to shift the sourcemaps down, for example by adding a couple of semicolons to the beginning of the mappings. Then new Function maps well, but now everything moves off for eval. That is, when you generate mappings, you must clearly understand how you are going to run this code, otherwise the mappings will point the wrong way.

How good are the sourcemaps? But something went wrong

The main trouble with sourcemaps: if you mess up somewhere, in most cases the browser will not tell you anything, it will simply ignore them. And then you are left guessing with:

    Tarot cards
    Natal charts
    Google Maps

And even Google is of little help here, because it mostly has answers to questions in the spirit of "how do I set up WebPack", and there is only one reasonable setting. Why users were given so many grenades to shoot their feet with is not clear.

Let's fantasize. Sourcemaps of a healthy person

Okay, with sourcemaps things are rather sad right now. Let's try to design them from scratch. I would make a binary format for this, where not points but specific byte ranges are mapped. We allocate a constant number of bytes per span, one machine word. Working with it is simple and fast, and most importantly, it is enough for our needs. A span consists of three numbers:

    field           meaning
    source offset   where the range starts in the concatenation of all sources
    source length   the length of the range in the source
    target length   the length of the range in the result

This information is necessary and sufficient to uniquely map the source onto the result, even if the result is binary rather than text. And if we need to rework something somewhere, it is done by a simple and efficient function. But, unfortunately, we have to work with what we have now.

Is it worth messing with sourcemaps?

I hope I have managed to show that sourcemaps are yet another swamp better not stepped into. During transformations they must be carefully tracked so that they do not get lost or shifted. Error messages must point to the source, and in the case of macros you need to show a whole trace through the source. In total: difficult in themselves, must be carried through transformations, must be carried into error messages, plus a trace for templates. I would not want to mess with them, but I have to. But let's think about how to avoid them.
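As an aside, the two-line shift that new Function introduces is easy to observe in Node.js by looking at the wrapper V8 synthesizes. The exact wrapper text is engine-specific, so treat this as an illustration rather than a guarantee:

```javascript
// V8 wraps the body of new Function in a synthetic "anonymous" function
// declaration. The body only starts on the third line of the wrapper, so
// line-based mappings generated for the bare body point two lines too high.
const compiled = new Function("a", "return a + 1;");

const lines = compiled.toString().split("\n");
console.log(lines);
// In Node.js this prints something like:
// [ 'function anonymous(a', ') {', 'return a + 1;', '}' ]

const bodyLine = lines.indexOf("return a + 1;");
console.log(`body starts ${bodyLine} lines down`);
```

Two leading semicolons in the mappings string skip exactly those two synthetic lines, which is the workaround described above.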
Difficult? Let's take Babel

Let's take a popular tool like Babel. Surely all these problems have already been solved there and you can just sit down and go. Let's take the first plugin that comes to hand:

    import { declare } from '@babel/helper-plugin-utils'
    import type NodePath from '@babel/traverse'

    export default declare( ( api, options ) => {
        const spec = options.spec
        return {
            name: 'transform-arrow-functions',
            visitor: {
                ArrowFunctionExpression(
                    path: NodePath< BabelNodeArrowFunctionExpression >,
                ) {
                    if( !path.isArrowFunctionExpression() ) return
                    path.arrowFunctionToExpression( {
                        allowInsertArrow: false,
                        specCompliant: !!spec,
                    } )
                },
            },
        }
    } )

It transforms an arrow function into a regular one. The task seems simple and there is not that much code; however, all this footcloth really does is call the standard Babel helper, and that's it. A bit too much code for such a simple task.

Babel, why so many boilerplates?

Okay, let's take a look at that helper:

    import '@babel/types'
    import nameFunction from '@babel/helper-function-name'

    // ...
    this.replaceWith(
        callExpression(          // mapped to this
            memberExpression(    // mapped to this
                nameFunction( this, true ) || this.node,  // mapped to this
                identifier( 'bind' ),                     // mapped to this
            ),
            [ checkBinding ? identifier( checkBinding.name ) : thisExpression() ],
        ),
    )

Yup, new AST nodes are generated here by global factory functions. The trouble is that you have no control over where they are mapped, and a little earlier I showed how important it is to precisely control what maps where. That information is not available at this point, so Babel has no choice but to map new nodes onto the only node the plugin matched (this), which does not always give an adequate result.

Shall we debug? AST of a smoker

The next problem is debugging the transformations. Here it is important to be able to see which AST there was before a transformation and which after. Let's take simple JS code:

    const foo = { bar }

Just look at what a typical abstract syntax tree (AST) for it looks like:

    {
        "type": "Program",
        "sourceType": "script",
        "body": [
            {
                "type": "VariableDeclaration",
                "kind": "const",
                "declarations": [
                    {
                        "type": "VariableDeclarator",
                        "id": { "type": "Identifier", "name": "foo" },
                        "init": { "type": "ObjectExpression", "properties": [ ] }
                    }
                ]
            }
        ]
    }

And this is only half of it. And this is not even a Babel AST, but one of the more compact ones: I just took the most compact of those available on ASTExplorer. That is actually why that tool appeared at all, because without it, looking at these JSONs is pain and suffering.

Shall we debug? AST of a healthy person

And here the Tree format comes to our aid, which I once developed specifically for the visual representation of ASTs. The same code in its js.tree representation is already much cleaner:

    const
        foo
        {
            bar

And it does not require any ASTExplorer. (Although I did make a tree-support patch for it, which the maintainer has been ignoring for the second year now. It's open source, baby.)

And how to work with it? Everything you need and nothing you don't

In my Tree API implementation, mol_tree, each node has only four properties: the node type, the raw value, the list of child nodes, and the span, a pointer to the range in the source:

    interface mol_tree {
        readonly type: string
        readonly value: string
        readonly kids: mol_tree[]
        readonly span: mol_span
    }

Each span contains a link to the source, the contents of the source itself, the row and column numbers of the start of the range, and the length of that range:

    interface mol_span {
        readonly uri: string
        readonly source: string
        readonly row: number
        readonly col: number
        readonly length: number
    }

As you can see, there is everything needed to represent and process any language, and nothing superfluous.

And how to work with it? Local factories

New nodes are generated not by global factory functions but, on the contrary, by local factory methods:

    interface mol_tree {
        struct( type, kids ): mol_tree
        data( value, kids ): mol_tree
        list( kids ): mol_tree
        clone( kids ): mol_tree
    }

Each such factory creates a new node but inherits the span from the existing node. Why does this work? Because this way we can precisely control which part of the source each node will map to, even after applying many AST transformations. In the diagram you could see how a result was generated from several files through a chain of transformations which cut something out, added something, and mixed things up, yet the binding to the source code was never lost.
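A minimal sketch of the local-factory idea, assuming nothing about the real mol_tree beyond what is described above (the class and method shapes here are illustrative):

```javascript
// Each node carries a span; factory *methods* derive new nodes that keep the
// span of the node they were derived from, so source attribution survives
// any number of transformations without extra bookkeeping.
class TreeNode {
  constructor(type, value, kids, span) {
    this.type = type;
    this.value = value;
    this.kids = kids;
    this.span = span; // e.g. { uri, row, col, length }
  }
  struct(type, kids = []) {
    // New structural node, same span as the node it was derived from.
    return new TreeNode(type, "", kids, this.span);
  }
  data(value, kids = []) {
    // New data node, same span.
    return new TreeNode("", value, kids, this.span);
  }
  clone(kids) {
    // Same node shape, new children, same span.
    return new TreeNode(this.type, this.value, kids, this.span);
  }
}

const src = new TreeNode("click", "", [], { uri: "app.tree", row: 1, col: 0, length: 5 });
const generated = src.struct("call", [src.data("document")]);

console.log(generated.span === src.span); // true: the mapping was inherited
```

The key design choice is that there is no way to create a node *without* a span, so "forgot to map it" is unrepresentable.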
And how to work with it? Generalized transformations

There are generic methods for writing transformations:

    interface mol_tree {
        select( ...path ): mol_tree
        filter( path, value ): mol_tree
        insert( path, ...value ): mol_tree
        hack( belt, context ): mol_tree[]
    }

Each of them creates a new AST without changing the existing one, which is very convenient for debugging. They allow deep fetches, deep filtering, deep inserts, and hacks.

What kind of hacks are these?

Hacks are the most powerful thing: they let you walk the tree, replacing nodes of various types with the results of different handlers. The easiest way to demonstrate how they work is to implement a trivial templating engine for an AST. Let's say we have a config template for our server:

    rest-api
        login username
        password password
    db root
        user username
        secret password

After parsing it into an AST, we can hack our config in just a few lines of code:

    config = config.list(
        config.hack( {
            username: n => [ n.data( 'jin' ) ],
            password: p => [ p.data( 'password' ) ],
        } ),
    )

As a result, all the placeholders are replaced with the values we need:

    rest-api
        login jin
        password password
    db root
        user jin
        secret password

What if something more complicated? An automation script

Let's consider a harder example, an automation script:

    click my_app Root Task
    click my_app Root Details TrackTime

Here we have a click command, which is passed the identifier of the element to click on. Let's process this script so that the output is JavaScript:

    script = script.hack( {
        click: ( click, belt ) => {
            const id = click.kids[ 0 ]
            return [
                click.struct( '()', [
                    id.struct( 'document' ),
                    id.struct( 'getElementById' ),
                    id.data( id.value ),
                    click.struct( 'click' ),
                ] ),
            ]
        },
    } )

Note that some of the nodes are created from the command name (click) and some from the element identifier (id). That is, the debugger will stop both here and there, and error stack traces will point to the correct places in the source code.

Is it even easier? jack.tree, a macro language for transformations

But you can dive even deeper and make a DSL for handling DSLs. For example, the transformation of the automation script can be described in the jack.tree language roughly like this:

    hack script from
        hack click
            document getElementById
                data from
            click
    script jack
        click my_app Root Task
        click my_app Root Details TrackTime

Each hack is a macro that matches a given node type and replaces it with something else. It is still a prototype, but it already does a lot.

And if there are different targets? Transform to JS, cutting out the localization

Hacks allow you to do more than just literally translate one language into another. With their help you can extract information of interest from the code. For example, suppose we have a script in some simple DSL that outputs something in English:

    js
        print begin Hello World
        when onunload print end Bye World

We can convert it to JS so that instead of the English texts, the localize function is invoked with the right key, simply by wrapping the script in a macro:

    console.log( localize( 'begin' ) )
    function onunload() {
        console.log( localize( 'end' ) )
    }

And if there are different targets? Isolate translations, ignoring the logic

But we can apply a different macro to it, loc:

    loc
        print begin Hello World
        when onunload print end Bye World

Then, on the contrary, all the logic is ignored and we get JSON with all the keys and their corresponding texts:

    {
        "begin": "Hello World",
        "end": "Bye World"
    }

And if there are different targets? We change transformations like gloves

In jack.tree these macros are described by relatively simple code:

    hack js
        hack print
            console.log( from )
        hack localize
            type from
        hack when
            function
                struct type from
                kids from
    hack loc
        hack print from
        hack when kids from

As you can see, other macros can be declared inside a macro; that is, the language is easily extended by means of the language itself. Thus it is possible to generate different code, take into account the context in which nodes appear, and match only in that context. In short, the technique is very simple but powerful, and nimble at the same time, since we never have to walk back up the tree: we only ever descend it.
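A toy version of such type-directed tree hacking, using a plain-object node shape of my own invention rather than the real mol_tree API, might look like:

```javascript
// A "hack" walks the tree top-down. Whenever a node's type has a handler in
// the belt, the handler decides what the node is replaced with (zero, one or
// many nodes). Unhandled nodes are kept, with their children hacked in turn.
function hack(node, belt) {
  const handler = belt[node.type];
  if (handler) return handler(node, belt);
  return [{ ...node, kids: node.kids.flatMap((kid) => hack(kid, belt)) }];
}

const config = {
  type: "login",
  kids: [
    { type: "username", kids: [] },
    { type: "password", kids: [] },
  ],
};

// Replace placeholder nodes with concrete values, as in the templating
// example above.
const result = hack(config, {
  username: () => [{ type: "data", value: "jin", kids: [] }],
  password: () => [{ type: "data", value: "password", kids: [] }],
});

console.log(JSON.stringify(result, null, 2));
```

Note that handlers return arrays, which is what makes zero-or-many replacement (and therefore deletion and splicing) fall out of the design for free.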
Something went wrong? A trace of transformations

Great power requires great responsibility. If something goes wrong and an exception occurs, and we have a macro on a macro with a macro driving it, it is extremely important to output a trace that helps figure out who matched what and where on the way to the error: here we see that the exception occurred at one point, but the mistake was made by a person at another point, which we reached from a third.

Well, why another bicycle?

And here you are most likely wondering: "Dima, why yet another bicycle? Enough bicycles already!" I would be glad, but let's briefly compare it with the alternatives:

                              Babel   TypeScript   tree
    API complexity            high    ∞            low
    Abstraction from language no      no           yes
    API immutability          no      no           yes
    Convenient serialization  no      no           yes
    Self-sufficiency          no      no           yes

Babel has a great many functions, methods, and properties. TS has some kind of prohibitive complexity and almost no documentation. Both are nailed to JS, which complicates their use for custom languages. They have a mutable API without concise AST serialization, which greatly complicates debugging. Finally, Babel's AST is not self-sufficient: we cannot directly generate both the resulting script and the sourcemaps from it; for that, the source code has to be extended in roundabout ways, and error messages suffer from the same trouble. TS is better at this, but it is the other extreme: along with the banana you get the monkey, the jungle, and its own solar system in several guises.

A typical pipeline: something is wrong here

Let's look at what a typical front-end pipeline looks like:

    TS:      parsed, transpiled, serialized
    Webpack: parsed, tree-shaken, assembled, serialized
    Terser:  parsed, minified, serialized
    ESLint:  parsed, checked everything, serialized

Something is wrong here. All these tools lack a single language for communication, some representation of the AST that would be, on the one hand, as simple and abstract as possible, and on the other, able to express everything each tool needs without being tied to any one of them. And in my opinion, the Tree format is the best fit for this. So in the future it would be great to nudge them all toward switching to this format. But unfortunately, I am not enough of an influencer for that. So let's not roll out our lip too far, but dream a little.

What would a healthy person's pipeline look like?

    Parse into AST.
    Transform and check everything.
    Serialize into scripts, styles and sourcemaps.

Thus the main work happens at the AST level, without intermediate serializations. And even if we need to serialize the AST temporarily, for example to hand it to another process, a compact Tree can be serialized and parsed much faster than sprawling JSON.

How to avoid traveling between result and sourcemap? text.tree

Okay, we have transformed the AST; it remains to serialize it. If this is done separately for each language, it is too hard: another language can have dozens or even hundreds of node types, and each needs not only to be serialized correctly but also to get a correct span in the mapping. To make this easier, Stefan and I developed text.tree, where there are only three types of nodes: lines, indents, and raw text. A simple text.tree document of a few lines with indented raw text serializes, by one standard function, into the output text together with an inline sourcemap of the familiar kind:

    //# sourceMappingURL=data:application/json,...

Any other language can be transformed into text.tree relatively easily, without any dances with spans, and the further serialization with sourcemap generation is just the use of standard, already-written functions.

What if you need WebAssembly? wasm.tree to bin.tree

Besides textual serialization we also have binary serialization. Everything is the same here: transform any language into bin.tree, then get a binary from it with a standard function. For example, let's take a simple wasm.tree code:
    custom xxx
    type xxx
        => i32
        => i32
        => f64
        <= f64
    import foo bar
        func xxx

And now let's run it through the wasm.tree compiler, get bin.tree, immediately convert that to a binary, and validate it with the WASM runtime. You can write code either directly in wasm.tree or in any DSL of yours that is transformed into wasm.tree. Thus you can write for WebAssembly without diving into the wilds of its bytecode. Well, once I finish this compiler, of course. If someone is ready to help, join in.

Even WASM with sourcemapping

And of course we automatically get sourcemaps from bin.tree as well. It's just that they won't work: for WASM you need to generate the older mapping format that is used for compiled programming languages, DWARF. But I am still afraid to climb into that jungle.

Forgotten something?

So far we have only talked about generating code from our DSL. But comfortable work with it requires many more things:

    Syntax highlighting
    Hints
    Checks
    Refactorings

One extension to rule them all

I have a wild idea: for each IDE, make one universal plugin that can read a declarative description of a language's syntax and use it to provide basic integration with the IDE: highlighting, hints, validation. So far I have implemented highlighting. There is a three-minute video on the mol channel of the process of describing a new language for your project. You do not need to restart anything, install developer builds of the development environment, or special extensions for it. You just write code, and it recolors your DSL in real time. On the right you see code in the view.tree language, and on the left a description of that language. The plugin knows nothing about this language, but thanks to the description it knows how to colorize it.

What do you need for automatic highlighting?

It works simply: upon encountering an unfamiliar language (determined by the file extension), the plugin scans the workspace for a schema for this language. If it finds several schemas, it connects them all. There is also a requirement for the language itself: the semantics of its nodes must be expressed syntactically, for example "starts with a dollar sign" or "has the name null". That is, there should be no syntactically indistinguishable nodes with different semantics. This, however, is useful not only for highlighting but also for making the language easier for the user himself to understand.

In total, what is needed:

    A declarative description of the language.
    Syntactic binding to semantics.
    No per-language installation.
    Default heuristics.

Yes, a language description is not even strictly necessary, because sometimes the default heuristics are enough to color any tree-based language.

Where to go next

This is where my story ends. I hope I managed to interest you in my research, and if so, you may find the following links helpful:

    nin-jin.github.io/slides/sourcemap  - these slides
    tree.hyoo.ru                        - a sandbox for tree transformations
    @jin-nin                            - JS tweets

Thank you for listening. I felt better.

Witness testimonies

    "At the beginning it was a bit difficult to focus on the problem. It's complicated, and it's not clear where to apply it."
    "I still don't understand why this report was needed at this conference. The topic seems to have been covered, but the design of the DSL and its practical applicability are somewhat strange."
    "The title does not match what was delivered: the sourcemap material ends early, and the rest of the time the author talks about his framework, which has nothing to do with the topic. I wasted my time; it would have been better to watch another speaker."
    "Cool theme, and Dima almost got rid of his professional deformation with mol."
    "Interesting report. Dmitry spoke very well about the subject area, highlighted possible problems, and thought about ease of use for the user. Very cool."
2022-02-17 20:09:31 |
Overseas TECH
DEV Community |
14 Codepens to Blow You Away! |
https://dev.to/code_jedi/14-codepens-to-blow-you-away-2m02
|
14 Codepens to Blow You Away

Have you ever come across some CSS or JavaScript code that blew you away with its elegance and functionality? No? Well, there's a first time for everything. Plus, some of these codepens are excellent learning material for web designers and web developers.

    Blooming flowers in pure CSS
    A 3D gaming room made from pure CSS
    A pure CSS interactive 3D stopwatch
    Newton's lightbulbs loader
    A 3D isometric color picker
    A neon heart animation
    A self-destruct button
    A tilting maze game
    A simple chess AI
    Stairs on a wall in pure CSS
    Holographic Pokemon cards
    A turning-page effect in pure CSS
    A double helix in pure CSS
    A wordle game

That's it for this compilation. Hope you enjoyed it!
2022-02-17 20:03:02 |
Overseas TECH
DEV Community |
Online Dating After COVID-19 |
https://dev.to/marthaeclark/online-dating-after-covid-19-4500
|
Online Dating After COVID-19

Online dating after COVID-19 was a big hit for singles last year. After the viral outbreak, popular dating apps made modifications to accommodate the new social-distancing measures, and singles can once again swipe right and meet someone in real life. Gone are the days of pure hookup culture; dating apps have adapted. This article focuses on the changes these companies have made.

Although the CDC has issued new guidelines for quarantine, states are now beginning to loosen their requirements. While there is no official recommendation for re-entering the dating scene after a COVID outbreak, many experts believe the rules will remain in place during the pandemic. For now it is safe to resume online dating, but you will want to take the time to follow all precautions and CDC guidelines.

While the new public-health guidelines may be discouraging, it is important to remember that online dating can be an effective solution for singles. In March, Tinder users swiped over three billion times in a single day, and in May, Callmechat reported an increase in video calls and dates. The data is proof of the effectiveness of online dating apps, which also allow people to continue to meet and date while following the guidelines set forth by the CDC.

Online dating after COVID can be a difficult proposition, but it is not impossible. Even when you do meet someone in person, it is important to practice proper safety precautions, and fortunately there are plenty of online dating sites that can help you find the right partner. After the outbreak, video dates and socially distant meetups became the norm, and even if you cannot meet in person, online dating has become a good alternative and a great way to meet singles from different parts of the world.

The outbreak has also affected the dating process itself. While traditional meetups have become increasingly difficult, online dating apps have been helping singles find dates, and online dating after COVID has been an excellent way for singles to find love.
2022-02-17 20:02:18 |
Apple |
AppleInsider - Frontpage News |
How to cancel an App Store subscription on iOS 15 or iPadOS 15 |
https://appleinsider.com/articles/21/11/05/how-to-cancel-an-app-store-subscription-on-ios-15-or-ipados-15?utm_medium=rss
|
How to cancel an App Store subscription on iOS 15 or iPadOS 15

If you're suffering from subscription fatigue, it's easy to cancel your unwanted app subscriptions right from your iPhone or your iPad. Maybe you've forgotten to unsubscribe from an app that's been billing you every month for the last year, or perhaps you're trying to avoid being billed before a free trial is over. Either way, it's not a bad time to learn how to cancel those pesky, unneeded App Store subscriptions. AppleInsider suggests that you routinely check your subscriptions to prevent any unwanted billing; once every six months is a good routine to get into. Read more
2022-02-17 20:16:12 |
Apple |
AppleInsider - Frontpage News |
Apple announces "The Big Conn" true crime docuseries and podcast |
https://appleinsider.com/articles/22/02/17/apple-announces-the-big-conn-true-crime-docuseries-and-podcast?utm_medium=rss
|
Apple announces "The Big Conn" true crime docuseries and podcast

The four-part true crime docuseries is set to arrive on Apple TV+ in May, with a companion podcast launching exclusively on Apple Podcasts. "The Big Conn" will tell the story of former eastern Kentucky attorney Eric C. Conn, known for defrauding the government of more than half a billion dollars in the largest Social Security fraud case in history. The companion podcast will delve into Conn's outrageous lifestyle and will also feature exclusive interviews and behind-the-scenes details. Read more
2022-02-17 20:56:53 |
Overseas TECH
Engadget |
YouTube could ‘break’ sharing on borderline content to fight misinformation |
https://www.engadget.com/youtube-could-break-sharing-on-borderline-content-to-fight-misinformation-201819354.html?src=rss
|
YouTube could "break" sharing on borderline content to fight misinformation

YouTube is eyeing new measures to tackle misinformation on its platform. Among the changes being considered, according to Chief Product Officer Neal Mohan, are updates that would effectively "break" sharing features for videos with "borderline content." The change would be a major shift for the platform, though it's not clear whether the company will actually take such a step.

Mohan described the possibility in a lengthy blog post outlining the company's approach to preventing misinformation from going viral. In the post, he noted that so-called borderline content, "videos that don't quite cross the line of our policies for removal but that we don't necessarily want to recommend to people," can be particularly challenging to deal with. That's because YouTube aims to remove these videos from its recommendations, but they can still spread widely when shared on other platforms. "One possible way to address this is to disable the share button or break the link on videos that we're already limiting in recommendations," he wrote. "That effectively means you couldn't embed or link to a borderline video on another site."

Mohan added that the company is still wrestling with whether it should take this more aggressive approach: "We grapple with whether preventing shares may go too far in restricting a viewer's freedoms." An alternative approach, he said, could be adding "an interstitial that appears before a viewer can watch a borderline embedded or linked video, letting them know the content may contain misinformation."

If YouTube were to prevent sharing of some videos, it would be a dramatic step for the platform, which has repeatedly cited statistics claiming that less than one percent of views on borderline content come from recommendations. But critics have pointed out that this doesn't fully address the issue, and fact checkers and misinformation researchers have cited YouTube as a major vector of misinformation. Last month, a group of fact-checking organizations signed an open letter to the video platform urging it to do more to stop harmful misinformation and disinformation.

The YouTube exec hinted at other changes to come as well. He said the company is considering adding "additional types of labels" to search results when there's a developing situation and authoritative information may not be available. The company is also looking to beef up its partnerships "with experts and non-governmental organizations around the world" and invest in technology to detect "hyperlocal misinformation, with capability to support local languages."
2022-02-17 20:18:19 |
Business
Diamond Online - New Articles
Why Toyota doesn't have enough batteries to reach its 3.5-million-EV goal: the blind spot of Toyota at its zenith lies in Europe - The Blind Spot of Toyota at Its Zenith
https://diamond.jp/articles/-/295838
|
|
2022-02-18 05:25:00 |
Business
Diamond Online - New Articles
Ukraine's "three scenarios," including the possibility of Putin manufacturing a new crisis - Policy & Market Lab
https://diamond.jp/articles/-/295972
|
Military cooperation
2022-02-18 05:20:00 |
Business
Diamond Online - New Articles
Mitsubishi Corporation's offshore-wind sweep prompts talk of rule changes; rapid shift to a new auction system shakes the industry - Diamond Premium News
https://diamond.jp/articles/-/296620
|
Mitsubishi Corporation's offshore-wind sweep prompts talk of rule changes; rapid shift to a new auction system shakes the industry. After Mitsubishi Corporation won all of the auctioned offshore wind power projects planned off the coast of Akita Prefecture and off Choshi, Chiba Prefecture, questions are being raised about the auction system and evaluation criteria of the Ministry of Economy, Trade and Industry and the Agency for Natural Resources and Energy, which ran the public tender, such as whether it should come down to a contest of capital power.
2022-02-18 05:10:00 |
Business
Diamond Online - New Articles
The Fed changes its view of cryptocurrencies, now counting them among financial risks - From WSJ
https://diamond.jp/articles/-/296770
|
Cryptocurrency
2022-02-18 05:08:00 |
Business
Diamond Online - New Articles
Four things you must never do in business chat - The Strongest Writing Techniques
https://diamond.jp/articles/-/296141
|
Business improvement
2022-02-18 05:05:00 |
Business
Diamond Online - New Articles
Is the US-born athlete Gu China's pride? Mixed public sentiment - From WSJ
https://diamond.jp/articles/-/296771
|
Public sentiment
2022-02-18 05:03:00 |
Business
Dentsu-ho | Advertising industry trends and marketing columns and news
Between online and offline (Part 1)
https://dentsu-ho.com/articles/8051
|
Waseda University
2022-02-18 06:00:00 |
Subculture
Ra-blog
Otsu: Tenka Gomen / A soul-satisfying bowl worth seeking out even now... Omi salt chicken noodles @ Hamaotsu, Otsu City, Shiga Prefecture
http://ra-blog.net/modules/rssc/single_feed.php?fid=196584
|
|
2022-02-17 20:15:00 |
Hokkaido
Hokkaido Shimbun
House fire in Sapporo: woman dead, man unconscious in critical condition
https://www.hokkaido-np.co.jp/article/647122/
|
Unconscious
2022-02-18 05:12:00 |
Hokkaido
Hokkaido Shimbun
&lt;Editorial&gt; Prime Minister Kishida's press conference fails to dispel the public's anxiety
https://www.hokkaido-np.co.jp/article/647091/
|
Spread of infection
2022-02-18 05:01:00 |
Business
Toyo Keizai Online
Top 100 companies whose individual shareholders grew even amid the COVID crisis: JT is No. 3, Japan Airlines No. 2, so who is No. 1? | Company Rankings | Toyo Keizai Online
https://toyokeizai.net/articles/-/512220?utm_source=rss&utm_medium=http&utm_campaign=link_back
|
Individual shareholders
2022-02-18 05:40:00 |
Business
Toyo Keizai Online
The harsh reality posed by the collapse of the "husband safety net": the "silent employment crisis" of the COVID era hitting working women | The Seriousness of "Female Poverty" Exposed by COVID | Toyo Keizai Online
https://toyokeizai.net/articles/-/509194?utm_source=rss&utm_medium=http&utm_campaign=link_back
|
Working women
2022-02-18 05:20:00 |