Posted: 2022-11-28 17:35:38  RSS feed digest for 2022-11-28 17:00 (42 items)

Category / Site / Article title or trend word / Link URL / Frequent words, summary, or search volume / Date registered
IT ITmedia All Articles [ITmedia PC USER] TSUKUMO releases a professional desktop workstation supporting Threadripper PRO + NVIDIA RTX https://www.itmedia.co.jp/pcuser/articles/2211/28/news145.html itmediapcusertsukumo 2022-11-28 16:32:00
IT ITmedia All Articles [ITmedia Business Online] Why people want to keep working after 61: No. 3 "to maintain health and fitness", No. 2 "to earn a steady income", and No. 1 is...? https://www.itmedia.co.jp/business/articles/2211/28/news121.html itmedia 2022-11-28 16:30:00
IT ITmedia All Articles [ITmedia News] Would you mind taking a look at my "mixi dark history"? https://www.itmedia.co.jp/news/articles/2211/28/news130.html itmedia 2022-11-28 16:28:00
IT ITmedia All Articles [ITmedia News] AI predicts a 21.0% chance of winning the World Cup match against Spain; JX Press builds a simulator https://www.itmedia.co.jp/news/articles/2211/28/news143.html itmedianewsw 2022-11-28 16:21:00
IT ITmedia All Articles [ITmedia News] Worldview matters more than a feature comparison chart: Money Forward Cloud Accounting vs. freee Accounting (part 2) https://www.itmedia.co.jp/news/articles/2211/25/news021.html freee 2022-11-28 16:19:00
IT ITmedia All Articles [ITmedia Business Online] Nissan unveils the new Serena; e-POWER models go on sale next spring https://www.itmedia.co.jp/business/articles/2211/28/news137.html epower 2022-11-28 16:05:00
IT IT Leaders (IT information site for information systems leaders) NEC develops technology that recognizes, in real time, what multiple workers on a construction site are doing from camera footage | IT Leaders https://it.impress.co.jp/articles/-/24103 NEC announced that it has developed technology that recognizes, in real time and with high accuracy, the different tasks being performed by multiple people, based on footage from cameras installed at construction sites and similar locations. 2022-11-28 16:19:00
AWS AWS How AWS Will Return More Water Than It Uses by 2030 | Amazon Web Services https://www.youtube.com/watch?v=zfVwuqAbfCQ Amazon Web Services' commitment to be water positive by 2030 means the company will return more water to communities and the environment than it uses in its operations. Achieving this requires innovative solutions and collaborations involving AWS employees, global nonprofits, and public utilities, all with the same goal in mind: creating a better future for our planet. AWS is the world's most comprehensive and broadly adopted cloud platform, offering fully featured services from data centers globally. Millions of customers, including the fastest-growing startups, largest enterprises, and leading government agencies, are using AWS to lower costs, become more agile, and innovate faster. 2022-11-28 07:45:01
python New posts tagged Python - Qiita Python "environment setup" troubleshooting notes: dealing with the Microsoft Store opening when you type python in the terminal https://qiita.com/hit701/items/c3db83d73776e69a550d anacondapythonversion 2022-11-28 16:56:57
python New posts tagged Python - Qiita maya python setShotInfo https://qiita.com/aizwellenstan/items/6179dab45d02b7dba408 cmdsimport 2022-11-28 16:04:43
Ruby New posts tagged Ruby - Qiita Setting up a many-to-many association https://qiita.com/Naaa0/items/105f127f31888afc508f rails 2022-11-28 16:46:19
AWS New posts tagged AWS - Qiita What? Aurora & RDS now provide fully managed Blue/Green deployments!? https://qiita.com/minorun365/items/348e8525bb488aecf013 aurorards 2022-11-28 16:49:24
Ruby New posts tagged Rails - Qiita Setting up a many-to-many association https://qiita.com/Naaa0/items/105f127f31888afc508f rails 2022-11-28 16:46:19
Tech blog Developers.IO [Update] AWS Backup now supports Redshift https://dev.classmethod.jp/articles/aws-backup-redshift-support/ awsbackup 2022-11-28 07:40:18
Tech blog Developers.IO Event report from the 13th Tokyo Alteryx User Group meetup https://dev.classmethod.jp/articles/alteryx_user_community_2022/ alteryx 2022-11-28 07:00:49
Overseas Tech DEV Community The illusion of speed – why perceived performance matters https://dev.to/enterspeed/the-illusion-of-speed-why-perceived-performance-matters-f38

Time is relative. We have all heard this expression a million times before. Many years ago some German dude went on and on about it and even developed some theories about the same thing. But what does it actually mean?

At the risk of going all Neil deGrasse Tyson and answering questions that nobody has asked, we can circle back to the aforementioned German dude. The German dude, let's call him Albert, had a secretary who got burdened with questions from reporters about the meaning of relativity. To help her out, Albert told her to simply answer these questions with the following example: "When you sit with a nice girl for two hours you think it's only a minute, but when you sit on a hot stove for a minute you think it's two hours. That's relativity."

Interesting. This means that how we perceive time depends on our frame of reference. Moreover, the time we see on a clock doesn't necessarily reflect our "brain's time" (psychological time).

"Cool stuff, nerd. But what the heck does this have to do with website performance?" Hang on, hang on, I'll get to it. Just like taking your partner on a date, we must first set the mood. So let me light some candles and throw on some Barry White before diving into it.

When we talk about web performance, we often tend to focus solely on the metrics. "What is your Lighthouse score?", "What about your LCP?", and "Have you optimized your FID?" often get thrown around. We are so enthralled by those goddamn Lighthouse scores that we risk forgetting about the user.

"Wait, what? But I do this FOR the user." Yes, and that's great. Optimizing your web vitals is important to the user experience, but it doesn't tell you whether the user perceives your website as fast or slow. You can have two websites with identical web vital metrics, meaning one loads just as fast as the other, where one is perceived as fast by the user and the other as slow. I know, it's weird.

We call this perceived performance. Where traditional performance optimization focuses on objective speed, perceived performance optimization focuses on subjective speed, and according to Mozilla it's even more important than the "actual" performance: "The perception of how quickly and smoothly pages load and respond to user interaction is even more important than the actual time required to fetch the resources." (MDN Web Docs)

Just like any other subject in tech, the rabbit hole goes quite deep when we start looking into perceived performance. So deep, in fact, that an entire book, "Designing and Engineering Time: The Psychology of Time Perception in Software", has been written about it. Therefore we're not going to follow the white rabbit to the end of the hole in this article, but merely glance into it; we simply don't have the time (ba dum tss).

What we want to do is keep the user in a state of "flow". But what is flow? Authors Rina A. Doherty and Paul Sorenson describe flow in their paper "Keeping Users in the Flow: Mapping System Responsiveness with User Experience" as "...commonly described as the natural, fluid state of being productively engaged with a task without being aware of the technology that is driving it. As such, if successful, technology can become virtually forgotten when a user is immersed in the experience, or is in the flow."

A way to make sure we don't break this flow is to avoid users going into the dreaded "passive state".
The active and the passive state

The way humans perceive time depends on the activity they're doing. An activity can be broken into two states: the active state and the passive state. In the active state we have a high level of mental activity, and in the passive state our mental activity is low. We often enter the passive state when we're waiting for something, for instance standing in a queue. Professor Richard Larson ran an experiment on this very thing, and his research showed that we tend to substantially overestimate the time that has passed in the passive state.

"Okay, so avoid queues on my website. Got it. Why are you telling me this?" Because we are constantly switching between the active and the passive state, including when we browse the web. We are of course in the active state when we are navigating the web. Can you guess when we enter the passive state? Bingo: when we're waiting for a page to load.

The good news is we don't enter this state immediately. It takes about a second for us to transition to this state, according to research done by the Nielsen Norman Group. This means that whenever the load time for a page is one second or more, we risk the user entering the passive state. What's worse, they experience the load time as much longer than it is. As we saw, people on average overestimate time spent in the passive state. If we have a page that takes four seconds to load, the user will actually experience it as slower than it really is: the first second is spent in the active state, but during the next three seconds the user risks moving to the passive state, which they will then experience as noticeably longer, resulting in a perceived load time well above the measured four seconds.

You may be slamming your fist on the table by now, saying "TIME IS TIME! This isn't fair." And no, it isn't, to both statements. But luckily there are tricks we can use to overcome this hurdle. We have two options here: avoid users entering the passive state by keeping them in the active state, or make the passive state feel faster. Let's look into each.

Keeping the user in the active state

Loaders

The first thing we can do to keep users in the active state is to be mindful of when we use loaders (spinners, loading bars, etc.). I know this seems counterintuitive, but hear me out. Remember the scene in the first "The Fast and the Furious" movie (yes, I'm that old) where Jesse presses the nitrous too early? Johnny then says "Too soon, junior", presses the nitrous, and overtakes poor Jesse. This is the same thing, except not quite as cool: we don't want to show the loaders too soon.

We know it takes the user about a second to naturally transition to the passive state. This means that showing them a loader when the wait (loading) is under a second is not only unnecessary, it can also harm the perceived performance. Since loaders tell the user that "now you have to wait", they force the user from an active state into a passive state. Therefore we should avoid using loaders if we anticipate the wait time to be under a second.

Avoid layout shifts

(Image from web.dev.)

Content "jumping around" can make it feel like the page is still loading and send the user into a passive state. We call these layout shifts, and they are often caused by images being loaded, which then causes the content to "jump". Another culprit can be custom fonts being downloaded: if the custom font isn't the same size as the fallback font (the font shown before the custom font loads), it might cause layout shifts. To avoid these layout shifts, make sure you give your image element a width and a height (required if you're using next/image). For layout shifts caused by fonts, make sure to preload your custom fonts and find a fallback font that matches the custom font in size. Layout shifts are measured in Google Lighthouse via the CLS (Cumulative Layout Shift) metric.
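To make the image-dimension and font-preload advice concrete, here is a minimal sketch (not from the original article) of what this could look like in a Next.js page; the file paths, font, and dimensions are invented for illustration:

```tsx
// pages/index.tsx -- illustrative sketch only; paths, font, and dimensions are hypothetical.
import Head from "next/head";
import Image from "next/image";

export default function Home() {
  return (
    <>
      <Head>
        {/* Preload the custom font so the swap from the fallback font happens early. */}
        <link
          rel="preload"
          href="/fonts/my-font.woff2"
          as="font"
          type="font/woff2"
          crossOrigin="anonymous"
        />
      </Head>
      {/* Explicit width/height lets the browser reserve space before the image
          arrives, so surrounding content does not jump (lower CLS). */}
      <Image src="/hero.jpg" alt="Hero" width={1200} height={630} />
    </>
  );
}
```

Matching the fallback font's metrics to the custom font is typically handled in the @font-face rule (for example via the size-adjust descriptor), which is beyond the scope of this sketch.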
Animations

Another way to keep the user in the active state is by making sure your animations are smooth. If your animations don't feel natural and don't move how the user might expect (for instance, a slight lag), this can cause the experience to feel slow. If the animations start feeling "laggy", it can cause the user to transition to the passive state. So make sure your animations aren't too complex for the user and that they feel natural. A good way to provide a smooth experience is to render your animations at a high, steady frame rate.

The :active selector

(Image from Adobe.com.)

The secret to keeping the user in the active state is to provide immediate feedback. The user likes to know that something is happening at the other end when they take an action, and this immediacy keeps users from going into the passive state. A super easy way to implement this in your UI is to add an :active selector to all your button and link elements. The :active pseudo-class represents an element that has been activated by the user. Changing the background color of the button (or shadow, border color, etc.) when the user clicks, activating the :active pseudo-class, provides the user with feedback that something is happening.

Preloading

As the word might suggest, preloading means that we load something in advance. There are two ways we can think about preloading: one is that we start loading something which we will need later on, the other is to show something before it's done loading. Let's look at each one.

In the first case, loading something before we need it, we anticipate what the user will do next. Instagram, for instance, does this by beginning to upload your image as soon as you select it: they anticipate you will publish your post, so they do the heavy work before you hit the post button. We can do a similar thing on our website by preloading pages before users navigate to them. Frameworks like Next.js have this feature built into their router: in Next.js, when using next/link, you can use the "prefetch" prop, and any link that is in the viewport (initially or through scroll) will be preloaded.

In the second case, we start showing something before it's completely done loading. We see this all the time on streaming platforms: if you open a YouTube video, it will start playing even though it hasn't downloaded the whole video; instead it estimates how fast you can stream it and waits for only that portion of the video to be loaded. This is the same as when we choose to load above-the-fold content first on our website: we want the user to be able to interact with the page right away, without having to load the entire page. We achieve this by lazy loading images and components, as well as deferring non-essential scripts.
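As a rough illustration of the prefetching idea (this is not code from the article, and the routes are hypothetical), a navigation component using next/link might look like the sketch below; note that current Next.js versions prefetch in-viewport links by default in production builds, so the prop is mostly used to opt out:

```tsx
// components/nav.tsx -- illustrative sketch; the routes are hypothetical.
import Link from "next/link";

export function Nav() {
  return (
    <nav>
      {/* Prefetched automatically once the link enters the viewport (production builds). */}
      <Link href="/pricing">Pricing</Link>
      {/* Opt out of prefetching for rarely visited routes. */}
      <Link href="/legal/terms" prefetch={false}>
        Terms
      </Link>
    </nav>
  );
}
```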
Okay, now let's look at how we can make the passive state feel faster.

Making the passive state feel faster

All right then, it's time to manipulate time. Don't worry, we're not going to create any time paradoxes; we wouldn't want to unravel the very fabric of the space-time continuum and destroy the entire universe. Great Scott! Instead we will use smoke and mirrors to make the passive state feel much faster than it really is.

Let's first look at a real-world example. Back in the early industrial age, buildings started growing taller and taller, which meant more people started using elevators. Back then elevators were quite slow, so naturally people did what people do best: they began complaining about it. As a result, elevator companies started tackling this problem by designing elevators that were faster and safer. Unfortunately, at the time this was a very expensive thing to do. However, in another elevator company one engineer had a different take on the problem. Speaking like a true software engineer, he blamed the user: "I think our elevator speeds are just fine, people are crazy." This guy was basically a real-life Principal Skinner. His theory was that the problem wasn't the elevators' speed (objective), but rather that the users thought they were slow (subjective). So while the other companies went off trying to optimize their motor and pulley designs, this company tackled the problem differently: they wanted to solve why the user thought it was slow.

Turns out that being trapped inside a metal box suspended many meters up in the air, held up only by a metal wire, isn't exactly fun. The time seemed much longer when you had nothing to do but stare at the wall, trying to cope with your fear of being trapped. So how did they solve this problem? They installed mirrors. Yes, mirrors. Seems silly, but that was really all it took. Suddenly the users perceived the elevator rides as much faster than before: now they could look at themselves, check if their hair and makeup were okay, and become distracted by their own reflection. As an added benefit, it helped people suffering from claustrophobia, since the space now seemed bigger than it really was. And with that, we conclude today's history lesson.

Let's look at what you can do to make the passive state feel faster. If we can't keep the users in the active state, we have to utilize loaders in the best way possible.

Experiment with the design of the loader

There are many ways to design a loading animation; just look at cssloaders.github.io. Probably the most famous loading animation is the spinner, but even a spinner can be designed in multiple ways. If you look at the different variations, you will probably realize that you interpret some as slow, whereas others seem fast. As we learned earlier, how an animation is designed can have a big impact on the user's sense of speed. If you want to be devious like Facebook, you can even try to shift the blame for a slow site/app to the user's own operating system. Doing an A/B test, they found that by mimicking the iOS loading animation, users started to blame iOS for the slowness instead of Facebook. Oh Zuck, you sneaky lizard man.

Use skeleton loaders

Skeleton loaders, even though they might sound spooky and scary, are a fantastic way to make your website seem faster. A skeleton loader is an animated placeholder that mimics the structure and look of the content being loaded. They often have a gray color along with a pulsing or waving animation, and they give the user an idea of how the content is going to look and feel. You'll see skeleton loaders implemented on big sites like Facebook, LinkedIn, Medium, etc. (Example of a skeleton loader from React Native Skeleton Content.) There are plenty of packages available for frameworks and libraries like React, Vue, etc., and if you're using a UI framework it might be built in, as Chakra UI has.
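As a minimal sketch of the skeleton-loader idea (not taken from any of the libraries mentioned above; the component, data shape, and styling are invented for illustration), a bare-bones React version could look like this:

```tsx
// components/article-card.tsx -- minimal skeleton-loader sketch; data shape is hypothetical.
import React from "react";

// Gray block that stands in for content while it loads.
// A real implementation would add a pulsing/shimmering CSS animation.
function Skeleton({ width, height }: { width: string; height: string }) {
  return (
    <div
      aria-hidden="true"
      style={{ width, height, backgroundColor: "#e2e2e2", borderRadius: 4, marginBottom: 8 }}
    />
  );
}

type Article = { title: string; excerpt: string };

export function ArticleCard({ article }: { article?: Article }) {
  // While the article is still loading, mimic its final layout instead of showing a spinner.
  if (!article) {
    return (
      <div>
        <Skeleton width="60%" height="1.5rem" /> {/* title placeholder */}
        <Skeleton width="100%" height="1rem" /> {/* excerpt line 1 */}
        <Skeleton width="80%" height="1rem" /> {/* excerpt line 2 */}
      </div>
    );
  }
  return (
    <div>
      <h2>{article.title}</h2>
      <p>{article.excerpt}</p>
    </div>
  );
}
```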
Tell the users what's going on

If your website or app has a multi-step process that takes time, it can be beneficial to tell the user what's going on instead of simply saying "loading". At Enterspeed, when bulk deploying schemas, we have implemented a loader that tells you whether it's validating or deploying a schema, as well as how many schemas are left. Often you won't get to see this because it goes fast, but just in case you have a slow connection, you can see what's going on. Moreover, if the process takes more than a few seconds, you should switch the spinner out for a progress bar so the user can see the progress being made. (Image from Microsoft.com.)

Distract the user in fun ways

For processes that take a very long time, you can try to find a fun way to distract the user. A real-world example is the crosswalk in Germany with a built-in pong game. To distract the user while they wait, you can show them a random fact, e.g. a random cat fact via the Cat Fact API, or maybe something related to your industry. For instance, in a performance monitoring tool I helped build, we implemented a random fact about Conversion Rate Optimization, which changed every few seconds while the user waited for their report to be generated. Netlify also does a great job distracting the user while you wait for your site to deploy: they give you the option to play a game while you wait, and even show you the deployment process meanwhile. How cool.

How do we measure perceived performance?

It's always quite difficult to measure and quantify a subjective thing, and the same goes for perceived performance. We can interview our users and ask them if a site felt slow or fast, using feedback plugins or by performing user tests. However, this can be difficult to do for smaller websites. Luckily, we also have metrics available that we can use as indicators for perceived performance.

I began the article by saying "We are so enthralled by those goddamn Lighthouse scores that we risk forgetting about the user." But truth be told, Lighthouse has done a lot to make sure that a website is perceived as fast, rather than just loading fast. Back in the day we used to simply measure load time and page size; these were the two metrics we focused on. That was until one day Google changed the way we thought about performance by introducing Google Lighthouse (and of course by making it a ranking factor, which we couldn't ignore). The metrics we find in the Google Lighthouse performance report are a way to quantify and measure user experience, including perceived performance. Some of the metrics you can use as indicators are: First Contentful Paint, First Input Delay (can't be measured synthetically), First Meaningful Paint, Largest Contentful Paint, Cumulative Layout Shift, Speed Index, and Time To Interactive. You can run a Lighthouse test via Chrome DevTools or via PageSpeed Insights. Just remember, these are only indicators and not a metric of how fast your site "feels"; for that, you need to ask the users.
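If you want to watch a couple of these indicators directly in the browser rather than through a Lighthouse run, a small sketch using the standard PerformanceObserver API might look like the following; the logging is purely illustrative, and the type cast is needed because layout-shift entries are not yet part of the default TypeScript DOM typings:

```ts
// perf-indicators.ts -- rough sketch using the browser PerformanceObserver API.

// Largest Contentful Paint: report the latest (and therefore largest) candidate.
new PerformanceObserver((list) => {
  const entries = list.getEntries();
  const last = entries[entries.length - 1];
  if (last) console.log("LCP (ms):", last.startTime);
}).observe({ type: "largest-contentful-paint", buffered: true });

// Cumulative Layout Shift: sum the scores of shifts not caused by user input.
let cls = 0;
new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    const shift = entry as unknown as { value: number; hadRecentInput: boolean };
    if (!shift.hadRecentInput) cls += shift.value;
  }
  console.log("CLS so far:", cls);
}).observe({ type: "layout-shift", buffered: true });
```

In practice, the web-vitals package from the Chrome team wraps these observers and handles the edge cases for you.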
Conclusion

The way humans perceive time doesn't necessarily match the time we see on a clock; it depends on the activity we're doing. Activities can be broken into two states, the active state and the passive state: in the active state we have high mental activity, whereas in the passive state our mental activity is low, and in the passive state we tend to overestimate the time that has passed. We're constantly switching between the active and the passive state, even when browsing the web; when we wait for a page to load, we risk moving into the passive state. You can improve your website's perceived performance by keeping users from switching to the passive state, and if they do end up in the passive state, you can make tweaks to make it feel faster. It's difficult to measure and quantify perceived performance; surveys and user tests can be used instead, and the metrics in Google Lighthouse can be helpful indicators.

Acknowledgment

A huge shout-out to Eli Fitch and his great video "Perceived performance: The only kind that really matters", which helped form this article. The same goes for Raelene Morey's fantastic article "What is Perceived Performance and Why You Need to Optimize It" and Luke Jones's awesome article "A Designer's Guide to Perceived Performance". Y'all rock. Examples as well as tips and tricks have been shamelessly stolen from these great resources. 2022-11-28 07:50:59
Overseas Tech DEV Community WebRTC 102: Understanding libWebrtc https://dev.to/rishit/webrtc-102-1-understanding-libwebrtc-1g1e

WebRTC 102: Understanding libWebRTC

Jump to the libWebRTC section if you are in a hurry and not interested in a general overview of the WebRTC space. If you've missed the previous part, here it is: WebRTC: Demystifying ICE (Dyte).

Introduction

Anyone who has even slightly dwelled in the world of real-time video/audio communication over the internet will surely have heard about WebRTC, the standard for ultra-low-latency, real-time media communication over the Internet. You may have also tried building a simple P2P application using RTCPeerConnection, or tinkered with high-level libraries such as Mediasoup, Pion, or Jitsi, to name a few. But something is missing: what really powers this new family of protocols? How does it work? If you're interested in getting into the nitty-gritty of WebRTC, this series is for you.

Why should you care about WebRTC?

As previously stated, WebRTC is the protocol that powers all of the magic behind the scenes. Understanding WebRTC will help you understand how to optimize for a wider range of environments, enable you to make low-level modifications to cater to your particular use case, and so on. So, without further ado, let's get started.

A small refresher on the current state of WebRTC

To understand why libWebRTC exists and why it is so important, we should start by looking at its grassroots, back when Google first announced a shiny new open source project for web browsers. The project has since moved to a brand new website, webrtc.org, and the source can now be found in its own repository. This repository contains the native WebRTC library used by most modern browsers (Chrome, Firefox, Opera, etc.). Of course, browser vendors rarely use the exact same library and versions, due to compatibility reasons and differing release cycles; for example, Firefox maintains its own clone of libWebRTC on GitHub with its own modifications to the project.

The actual WebRTC standard

libWebRTC is amazing and everything, but the real mission, maintaining interoperability between multiple platforms, browsers, etc., is realized by two projects. The IETF maintains a collection of documents with various standards for real-time communication; these documents do not cover the actual WebRTC API, but rather the different codecs and protocols that WebRTC clients use to communicate with each other, for example ICE, STUN, and RTP. The W3C maintains the spec for the official WebRTC API which you commonly see in browsers (RTCPeerConnection). All browsers are expected to comply with this specification to allow JavaScript clients to be cross-compatible. In actual practice, however, you will find that Chrome (or Chromium) is by far the closest implementation of this spec, and other projects currently implement a subset of the standard. When we refer to WebRTC as a whole, we usually mean these two specifications and not just Chrome's WebRTC implementation. Now that we have an idea of the WebRTC space, let's dive deeper into libWebRTC.

libWebRTC

libWebRTC is a C/C++ native implementation of the WebRTC API which is compatible with Windows, macOS, and Linux. On the mobile side of things, it also provides Java and Objective-C bindings for Android and iOS respectively. (Source for image: BlogGeek.me.) Yes, the project is open source, but Google and other browser vendors (Firefox, Opera) have been the primary contributors and driving force behind the codebase since its inception. Despite the fact that the most popular use case for libWebRTC has been web browsers, there is nothing stopping you from using it as a native library to communicate with other WebRTC peers.
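For orientation (this snippet is not part of the article), the browser-side W3C API that libWebRTC implements underneath looks roughly like this in TypeScript; the sendToPeer helper stands in for an application-specific signaling channel and is purely hypothetical:

```ts
// Minimal sketch of the W3C RTCPeerConnection API that browsers expose.
// Exchanging the offer/answer and ICE candidates is app-specific and only
// hinted at here via a hypothetical sendToPeer() helper.
declare function sendToPeer(message: unknown): void;

const pc = new RTCPeerConnection({
  iceServers: [{ urls: "stun:stun.l.google.com:19302" }], // example public STUN server
});

// Trickle ICE: forward each candidate to the remote peer as it is gathered.
pc.onicecandidate = (event) => {
  if (event.candidate) sendToPeer({ candidate: event.candidate });
};

// Render remote media when tracks arrive.
pc.ontrack = (event) => {
  const video = document.querySelector<HTMLVideoElement>("#remote");
  if (video) video.srcObject = event.streams[0];
};

async function call() {
  // Capture local media and add it to the connection.
  const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
  stream.getTracks().forEach((track) => pc.addTrack(track, stream));

  // Create and send the SDP offer; the answer comes back over signaling.
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  sendToPeer({ sdp: pc.localDescription });
}
```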
Should you be using libWebRTC?

Unless you are particularly keen on dealing with C/C++ and the million other pain points that come with it, you might find it easier to use other open source projects implementing the WebRTC specs, which we cover below. The Mediasoup project provides a high-level JavaScript/TypeScript interface to the WebRTC APIs; the core logic of the project is implemented in C++/Rust. Consider taking a look at it if you want an easy-to-use library instead of the low-level libWebRTC APIs. A notable project to mention is Pion, a Golang implementation of the WebRTC API. And of course we should mention the Rust port, WebRTC.rs; let's keep all the rustaceans happy too. If ease of use is not particularly important and you are looking for cross-platform support, libWebRTC is definitely the safest bet for making native WebRTC clients that are compatible with browser clients from Chrome, Firefox, and other browsers.

Getting started with libWebRTC

Now that we have talked about the WebRTC space, let's go over how you can start working on the libWebRTC codebase.

Cloning the repo

You can clone the source directly using git clone, but it is recommended that you install Google's depot_tools, which includes a set of tools for working with the project. Start by cloning the depot_tools repository. The repository's root folder has executable tools that we'll use later, so it can be convenient to add it to your PATH variable:

export PATH=/path/to/depot_tools:$PATH

Next, we can get the libWebRTC source by running:

mkdir webrtc-checkout
cd webrtc-checkout
fetch --nohooks webrtc
gclient sync

Now the src folder should have the entire source code from libWebRTC's main branch.

Building the source

Chromium, libWebRTC, and other Google projects use the ninja build system. But before we can use ninja to build the project, we must use another tool, gn. gn uses the configuration options in the BUILD.gn file to generate ninja build files for multiple operating systems, executables, shared libraries, etc. The webrtc.gni file contains some boolean flags that control which components of the project are built, which you can experiment with if you wish; for now we will just stick to the defaults. To generate ninja's build files, run:

gn gen out/Default

This will by default generate files for a DEBUG build that contains extra data, which we will use later while debugging an example in the repo. Finally, to start the build using ninja, run:

ninja -C out/Default

libWebRTC's threading model

The first step to working on any codebase is to locate the entry point of the code. Since libWebRTC is a library, it's hard to narrow it down to a single entry point, but most codebases will start by using the CreatePeerConnectionFactory function, which lets you easily create a WebRTC PeerConnection. The first three parameters of this function are essential to understanding the entire WebRTC codebase. These three thread objects are instances of the rtc::Thread class and are used globally throughout the codebase to allocate different tasks to separate threads, to keep processing tasks from bottlenecking network calls and external APIs. The network thread is responsible for writing the actual media packets flowing through your peer connection and performs minimal processing. The worker thread, the most resource-intensive of the three, deals with processing the individual video/audio streams received from or sent to peers. The signaling thread is where all the external API functions of the peer connection class run; all external callbacks also run on this thread, so it is essential for users of the library not to block inside callbacks, since this would block the whole signaling thread.
Throughout the codebase, libWebRTC uses the RTC_DCHECK_RUN_ON macro to assert that function calls are running on the correct threads. This is also super helpful for someone reading the codebase to know which thread each function runs on.

libWebRTC's peerconnection example

The best way to get your hands dirty with libWebRTC is by using the peerconnection example available in the src/examples/peerconnection directory of the project. If you followed the steps previously covered in the "Building the source" section, the example should already have been built according to the default rules in the webrtc.gni file. Note: as of the date of writing this article, we are using the commit bdfaecaeedab; some of the information covered in this section might get outdated in the future.

The architecture of the example

The example outlines how to set up a simple peer connection to send a video stream from one client to another and display it on the receiving end on top of a GTK window. As shown in the diagram in the original post, there are two components in this example: the client and the server. In this blog post we will concentrate on the client code, as the server logic is irrelevant to libWebRTC itself and is just a simple socket server that routes traffic between the clients.

Running the code

In the rest of this guide we'll assume that you are using a Linux machine. To run the example, first we need to start the server process. We can run the binary we previously built:

src/out/Default/peerconnection_server

This should start up the socket server on its default port. Now spin up two more terminal windows and run the client application in each of them. It will be useful to attach these to gdb, which will help us debug errors when we make changes to the codebase:

gdb src/out/Default/peerconnection_client
r

If the clients start up correctly, you should see two GTK windows pop up. Click Connect in both of these windows, and you should now see the list of connected peers in both of them. Now click on the peer in either one of the windows. This should set up a WebRTC connection that streams video from that window to the other. But at the time of writing this article, this seems to fail with an error. Let's try debugging why this error happens.

Debugging and fixing the client binary crash #1: race condition in thread management

We can run bt in our gdb window to view the call stack when the application crashed, and see that the following functions were called: PhysicalSocketServer::Wait is invoked in the main thread of the client, which blocks until we get a socket message. On clicking the peer in the interface, a socket message is received, which triggers OnMessageFromPeer. This triggers the creation of a new peer connection, which seems to call PhysicalSocketServer::Wait recursively again. On debugging the code, we found that this happens because of a caveat in the threading model we covered before: the thread where we invoke CreatePeerConnectionFactory is actually the main thread of our client application. The constructor for the factory has a check which makes sure that all function calls run on the signaling thread, but since we invoked the constructor from a different thread, it tried to queue a call on the signaling thread and blocked the main thread until the function call was completed on the former.
This causes Wait to be called on our main application thread recursively, and the rtc::Thread class doesn't allow Wait to be called twice in this manner. We can prevent this issue by simply queuing the function call on the signaling thread instead of synchronously calling it from the main thread of our application. There is a patch on GitHub Gist with the fixes, which you can use to apply them on your local machine. Now, when we rebuild the files and run the code, the video stream shows up correctly. This works for a short while, but after some time the video stream appears to get stuck in the client application on the receiving end, and it then crashes with a different segmentation fault error. Let's look into why this happens.

Debugging and fixing the client binary crash #2: buffer overflow in the video renderer

On viewing the call stack, we noticed that the crash occurs in the Redraw function in the main_wnd.cc file. On analyzing OnRedraw, we found that the function is responsible for transferring the peer's video from the remote renderer to the draw buffer. In the process, the code scales the width of the source video by duplicating adjacent pixels on the same row. Since the memcpy leads to a segfault after some time, it is possible that the size of the source video increases after a while, and because we initialize the draw buffer only once, we end up overflowing its bounds, as it still has the smaller size. We can confirm this by adding some log lines in the code to log the size of the remote renderer's video:

RTC_LOG(LS_INFO) << "Video size: " << size;

We can fix this bug by re-initializing the draw buffer each time the size of the remote renderer's video changes, as outlined in the linked patch. Now the video stream should render correctly for an indefinite period of time.

Conclusion

Congratulations, you reached the end of this article! You should now have a basic understanding of libWebRTC and how to work with the project. If you are interested in contributing, you can head over to WebRTC's bug tracker and try your hand at squashing some bugs. Try Dyte if you don't want to deal with the hassle of managing your own peer-to-peer connections. If you haven't heard of Dyte yet, check out how our SDKs and libraries revolutionize the live video and voice calling experience. Don't just take our word for it; try it for yourself. Dyte offers free usage minutes every month to get you started quickly. If you have any questions or simply want to chat with us, please contact us through support or visit our developer community forum. Looking forward to it! 2022-11-28 07:07:12
Overseas Tech WIRED 46 Best Cyber Monday Fitness and Outdoor Deals (2022): Ebikes, Shoes, Hiking, Backpacks https://www.wired.com/story/best-cyber-monday-fitness-deals-outdoors-2022/ gadgets 2022-11-28 07:28:00
Medical Medical & Nursing Care CBnews MHLW to support hospitals at risk of disruption from the withdrawal of physicians, and to help streamline applications for on-call/night-duty permits https://www.cbnews.jp/news/entry/20221128162615 medical institutions 2022-11-28 16:35:00
Medical Medical & Nursing Care CBnews Hospitalizations fall sharply in the NHI federations' September claims review; claim counts show their largest growth while medical costs grow only modestly https://www.cbnews.jp/news/entry/20221128162325 National Health Insurance 2022-11-28 16:35:00
Finance Bank of Japan: RSS Call for observers for the "Future of Payments Forum, Digital Currency Subcommittee: Technologies Supporting Central Bank Digital Currency (5th meeting)" http://www.boj.or.jp/announcements/release_2022/rel221128c.htm central bank 2022-11-28 17:00:00
Finance Bank of Japan: RSS Balance of foreign currency assets held by the Bank of Japan at the end of the first half of fiscal 2022 http://www.boj.or.jp/statistics/boj/other/other_gai/gai2209.pdf foreign currency assets 2022-11-28 16:05:00
Finance News - Hoken Ichiba TIMES E.design Insurance invites proposals from local governments for "realizing safe traffic environments and a safe society" https://www.hokende.com/news/blog/entry/2022/11/28/170000 E.design Insurance will donate in support of the selected traffic-safety initiatives; E.design Insurance Co., Ltd. is accepting proposals from local governments aimed at realizing safe traffic environments and a safe society. 2022-11-28 17:00:00
News @Nikkei digital edition Farewell ceremony for Mr. Inamori: architect Mr. Ando remembers "a person who thought things through relentlessly" https://t.co/7AFOQmPDrP https://twitter.com/nikkei/statuses/1597134105144983553 Inamori 2022-11-28 07:43:38
News @Nikkei digital edition Mitsubishi Corporation tops April-September earnings improvement; high resource prices and the weak yen separate winners from losers https://t.co/0s9sYLfii1 https://twitter.com/nikkei/statuses/1597127838926536707 Mitsubishi Corporation 2022-11-28 07:18:44
News @Nikkei digital edition BOJ has ¥874.9 billion in unrealized losses on its government bond holdings, a first under its unprecedented monetary easing https://t.co/c0I7q5YjIY https://twitter.com/nikkei/statuses/1597125807792455682 unprecedented easing 2022-11-28 07:10:40
Overseas news Japan Times latest articles 'COVID zero' unrest presents biggest domestic challenge yet for China's Xi https://www.japantimes.co.jp/news/2022/11/28/asia-pacific/china-protests-communist-party-xi-jinping-analysis/ Driven by the excessiveness of the anti-COVID measures, experts say, the rapid and organic spread of protests could represent one of the toughest tests for... 2022-11-28 16:25:03
Overseas news Japan Times latest articles Ad agency Hakuhodo raided as Olympic bid-rigging investigation widens https://www.japantimes.co.jp/news/2022/11/28/national/crime-legal/ad-agencies-raided/ The search of Hakuhodo, which ranks second after Dentsu, means Japan's top three advertising agencies are now being investigated in the probe. 2022-11-28 16:16:55
Overseas news Japan Times latest articles Naoki Nakamura reaches World Cup podium for first time https://www.japantimes.co.jp/sports/2022/11/28/more-sports/winter-sports-more-sports/nakamura-first-podium/ Ski jumper Naoki Nakamura earned his first career World Cup podium finish Sunday, placing third behind joint winners Stefan Kraft of Austria and Halvor Egner... 2022-11-28 16:53:01
Overseas news Japan Times latest articles Abi achieves long-awaited goal with first Emperor's Cup https://www.japantimes.co.jp/sports/2022/11/28/sumo/abi-dream-first-title/ Kyushu Grand Sumo Tournament winner Abi described his first career top-division championship as a long time coming on Monday. "To win a championship was a dream..." 2022-11-28 16:38:42
News BBC News - Home Palliative care: 'My dad should not have been expected to die in office hours' https://www.bbc.co.uk/news/health-63757539?at_medium=RSS&at_campaign=KARANGA research 2022-11-28 07:38:28
News BBC News - Home Matt Hancock: MP finishes third on I'm A Celebrity as Jill Scott wins https://www.bbc.co.uk/news/entertainment-arts-63634455?at_medium=RSS&at_campaign=KARANGA warner 2022-11-28 07:46:53
GCP Google Cloud Platform Japan Official Blog Simplify and automate cost optimization with the Cloud Storage Autoclass feature https://cloud.google.com/blog/ja/products/storage-data-transfer/optimize-your-cloud-storage-spend/ Autoclass automatically reduces costs and eliminates inefficiencies by moving data to the storage class whose pricing best fits your workload. 2022-11-28 08:00:00
News Newsweek Anti-government protests across China, with unprecedented calls for Xi Jinping to step down https://www.newsweekjapan.jp/stories/world/2022/11/post-100211.php As protests against the strict zero-COVID policy that China's increasingly authoritarian President Xi Jinping insists on break out across the country, videos posted to social media show authorities cracking down violently. 2022-11-28 16:36:23
IT Weekly ASCII CB Japan launches the "POPPO hair dryer for moms", a safety-focused design with a maximum temperature of 52°C https://weekly.ascii.jp/elem/000/004/115/4115001/ poppo 2022-11-28 16:40:00
IT Weekly ASCII A fair offering an exclusive illustration card with purchases of Elden Ring related books starts November 30 https://weekly.ascii.jp/elem/000/004/114/4114981/ release commemoration 2022-11-28 16:30:00
IT Weekly ASCII Hobby mail-order retailer AmiAmi holds the figure festival "AmiAmi Hobby Camp 2022 Autumn"! https://weekly.ascii.jp/elem/000/004/114/4114997/ event 2022-11-28 16:30:00
IT Weekly ASCII Tekwind releases "Luna Display", an adapter that turns an iPad into a second display https://weekly.ascii.jp/elem/000/004/115/4115000/ astrohqllc 2022-11-28 16:30:00
IT Weekly ASCII "Rakuten Seimei Park Miyagi" to be renamed "Rakuten Mobile Park Miyagi" next year https://weekly.ascii.jp/elem/000/004/115/4115017/ Miyagi Stadium 2022-11-28 16:55:00
IT Weekly ASCII Consumer Affairs Agency begins a trial of consumer-affairs consultations via LINE https://weekly.ascii.jp/elem/000/004/114/4114998/ consumer affairs 2022-11-28 16:10:00
IT Weekly ASCII Genshin Impact Ver. 3.3 launches December 7! New character "Wanderer" (voiced by Tetsuya Kakihara) arrives https://weekly.ascii.jp/elem/000/004/115/4115002/ cognosphere 2022-11-28 16:10:00
GCP Cloud Blog JA Simplify and automate cost optimization with the Cloud Storage Autoclass feature https://cloud.google.com/blog/ja/products/storage-data-transfer/optimize-your-cloud-storage-spend/ Autoclass automatically reduces costs and eliminates inefficiencies by moving data to the storage class whose pricing best fits your workload. 2022-11-28 08:00:00
