Posted: 2021-09-01 21:37:47  RSS feed digest for 2021-09-01 21:00 (36 items)

Category  Site  Article title / trend word  Link URL  Frequent words, summary / search volume  Registered
IT ITmedia 総合記事一覧 [ITmedia News] Japan's first magnetic recording tape, the "Soni-Tape," registered as a Future Technology Heritage item https://www.itmedia.co.jp/news/articles/2109/01/news170.html itmedia 2021-09-01 20:33:00
AWS AWS Japan Blog Creating replay footage in real time with AWS Elemental MediaLive and Amazon Rekognition (a tennis use case) https://aws.amazon.com/jp/blogs/news/jpmne-stream-tennis-matches-through-aws-elemental-medialive-and-generate-real-time-replays-with-amazon-rekognition/ In this example, match video (an MP4 file stored in an Amazon S3 bucket) is used as the input source for AWS Elemental MediaLive, and the live-stream source is duplicated for development and testing. 2021-09-01 11:02:40
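The AWS post above walks through this setup in the MediaLive console; purely as an illustration of the first step it describes (registering an MP4 file stored in S3 as a MediaLive input), here is a short boto3 sketch. The region, bucket, object key, and input name are placeholders, not values from the article.

```python
import boto3

# Illustrative sketch: register an S3-hosted MP4 as a MediaLive file input.
# The region, bucket, key, and input name are placeholders, not from the AWS post.
medialive = boto3.client("medialive", region_name="ap-northeast-1")

response = medialive.create_input(
    Name="tennis-match-replay-source",
    Type="MP4_FILE",
    # MediaLive file inputs point at the stored object by URL (placeholder below).
    Sources=[{"Url": "s3ssl://my-example-bucket/matches/match.mp4"}],
)
print(response["Input"]["Id"], response["Input"]["State"])
```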
python Pythonタグが付けられた新着投稿 - Qiita Building INSERT statements that load meeting-minutes files into a DB, using Python regex substitution https://qiita.com/JQinglong/items/3f1cc5343c5f1663a3e9 The post processes the minutes of an expert panel meeting on designating Super City-type national strategic special zones; a sample passage reads: "As the Minister said a moment ago, the requirement is first that the data-linkage foundation is solid and that the area will in future become a hub of digitalization; on that point I will ask the expert members to explain later." 2021-09-01 20:40:15
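The Qiita post above turns lines of a minutes file into INSERT statements with Python regex substitution; a minimal sketch of that idea follows. The file name, table name, column layout, and the "speaker: remark" line format are assumptions for illustration, not details from the post.

```python
import re

# Hypothetical schema: minutes(speaker, remark). The regex assumes each line looks
# like "speaker: remark"; that format is an assumption, not the post's actual input.
LINE = re.compile(r"^(?P<speaker>[^:：]+)[:：]\s*(?P<remark>.+)$")

def to_insert(line):
    m = LINE.match(line.strip())
    if not m:
        return None
    # Naive quoting of single quotes; a real loader should use parameterized queries.
    speaker = m.group("speaker").replace("'", "''")
    remark = m.group("remark").replace("'", "''")
    return f"INSERT INTO minutes (speaker, remark) VALUES ('{speaker}', '{remark}');"

with open("minutes.txt", encoding="utf-8") as f:
    statements = [s for s in (to_insert(line) for line in f) if s]

print("\n".join(statements))
```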
python Pythonタグが付けられた新着投稿 - Qiita [PyTorch 1.9.0] Predicting a time series (a simple formula) with an LSTM https://qiita.com/sloth-hobby/items/93982c79a70b452b2e0a 2021-09-01 20:34:28
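No excerpt of that article survives in this digest, so as a rough sketch of the kind of model such a post typically builds, the snippet below fits an LSTM to a sine wave and predicts one step ahead with PyTorch. The window size, hidden size, training length, and the sine-wave target are assumptions, not details taken from the article.

```python
import torch
import torch.nn as nn

# Sketch under assumptions: sine-wave data, 20-step input window, 1-step-ahead target.
class LSTMRegressor(nn.Module):
    def __init__(self, hidden_size=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):                 # x: (batch, seq_len, 1)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])   # predict the next value from the last step

# Build (input window, next value) pairs from a sine wave.
t = torch.linspace(0, 20, 1000)
series = torch.sin(t)
window = 20
X = torch.stack([series[i:i + window] for i in range(len(series) - window)]).unsqueeze(-1)
y = series[window:].unsqueeze(-1)

model = LSTMRegressor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
for epoch in range(200):
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()
```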
js JavaScriptタグが付けられた新着投稿 - Qiita [Memo] Stopping Nginx from serving stale cached images, with front-end changes only https://qiita.com/teracy55/items/c7159b37e1498e5c13f5 Concretely, the post the author referenced appended a random value to the image URL, but that bypasses the cache every time, so the author appends the record's last-updated timestamp instead. 2021-09-01 20:08:10
Program [全てのタグ]の新着質問一覧|teratail(テラテイル) When adding an item to the cart in Rails, I get the error ActionView::MissingTemplate in Customers::Carts#my_cart https://teratail.com/questions/357278?rss=all The directory layout is as shown in the figure and I believe everything is written correctly, so why does this error occur? 2021-09-01 20:58:41
Program [全てのタグ]の新着質問一覧|teratail(テラテイル) I want to create a custom endpoint in WordPress using the REST API https://teratail.com/questions/357277?rss=all My question: I want to exchange data through the REST API; I can fetch the list, but I cannot fetch individual detail pages, so I would appreciate any guidance. I created a custom endpoint myself by following the article below. 2021-09-01 20:56:58
Program [全てのタグ]の新着質問一覧|teratail(テラテイル) I want to display the elements of an array extracted with foreach two at a time, side by side https://teratail.com/questions/357276?rss=all Goal: lay out the elements that a PHP foreach loop renders on the page two at a time, side by side. 2021-09-01 20:47:23
Program [全てのタグ]の新着質問一覧|teratail(テラテイル) In Django, I want to allow only a single post to be created and then repeatedly updated https://teratail.com/questions/357275?rss=all I am building a Django web app in which a user creates one post and can keep updating it. 2021-09-01 20:01:31
Ruby Rubyタグが付けられた新着投稿 - Qiita [Ruby] AtCoder past problem B - Card Game for Two https://qiita.com/minhee/items/865aa4523fb77e3d2d04 After reading the input with n = gets.to_i and points = gets.split.map(&:to_i), the array that was just read is sorted in descending order so that it is easier to process later. 2021-09-01 20:53:38
AWS AWSタグが付けられた新着投稿 - Qiita Deploying files from a local PC to AWS EC2 https://qiita.com/ebtiag/items/dd61ef90dcccefc1066c This post summarizes how to deploy files and folders from a local PC to AWS EC2. 2021-09-01 20:39:14
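The usual approach for this kind of deploy is scp or rsync over SSH with the instance's key pair; the post's exact commands are not reproduced in this digest, so the snippet below is only a minimal sketch of that pattern, with the key path, host address, and file names as placeholders.

```python
import os
import subprocess

# Placeholders: key pair path, EC2 address, and the files/folders being copied.
KEY = os.path.expanduser("~/.ssh/my-ec2-key.pem")
REMOTE = "ec2-user@203.0.113.10"   # public IP or DNS name of the EC2 instance

# Copy a single file, then a whole folder (-r), to the instance over SSH with scp.
subprocess.run(["scp", "-i", KEY, "app.tar.gz", f"{REMOTE}:/home/ec2-user/"], check=True)
subprocess.run(["scp", "-i", KEY, "-r", "public", f"{REMOTE}:/home/ec2-user/"], check=True)
```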
Docker dockerタグが付けられた新着投稿 - Qiita What is the "httpd-foreground" command specified in the httpd image? https://qiita.com/ryoutaka/items/343f048e6a3e0cefdab5 The Apache httpd daemon is usually started with the service or systemctl command, but the COMMAND column for the httpd image in docker container ls shows httpd-foreground, so the author looked into what it actually is. 2021-09-01 20:30:47
Git Gitタグが付けられた新着投稿 - Qiita Deploying files from a local PC to AWS EC2 https://qiita.com/ebtiag/items/dd61ef90dcccefc1066c This post summarizes how to deploy files and folders from a local PC to AWS EC2. 2021-09-01 20:39:14
海外TECH Ars Technica Google Pixel 5a review: “Which Android phone should I buy?” This one https://arstechnica.com/?p=1789307 great 2021-09-01 11:15:52
海外TECH DEV Community Deploying Multi-Node Kubernetes Cluster on AWS Using Ansible Automation https://dev.to/surajwarbhe/deploying-multi-node-kubernetes-cluster-on-aws-using-ansible-automation-1b7e Walkthrough of provisioning a multi-node Kubernetes cluster on AWS with Ansible. The author installs and configures Ansible on a RHEL controller node (ansible.cfg, an inventory file, the downloaded .pem key pair, remote user ec2-user with privilege escalation), keeps an IAM user's access and secret keys in an Ansible Vault file (cred.yml), and creates three roles with ansible-galaxy init: a cluster role plus master and worker roles. The cluster role installs the boto libraries, uses the ec2_group and ec2 modules to create a security group and launch three EC2 instances, registers their public IPs with add_host into master and slave host groups, and waits with wait_for until SSH is reachable; values such as region, subnet, AMI ID, key pair, and instance type live in the role's vars file. The master role installs Docker, iproute-tc, kubeadm, kubelet, and kubectl, adds the Kubernetes yum repository, pulls the config images, switches Docker's cgroup driver to systemd via /etc/docker/daemon.json, refreshes sysctl, runs kubeadm init, copies admin.conf to the user's .kube/config, applies the Flannel add-on, and registers the output of kubeadm token create --print-join-command. The worker role repeats the package, daemon.json, and sysctl setup and then runs the stored join command to join the master. A setup.yml playbook ties the roles to the localhost, master, and slave host groups and is run with ansible-playbook setup.yml --ask-vault-pass; finally, kubectl get nodes shows every node in the Ready state and a test httpd deployment confirms the cluster works. GitHub link and LinkedIn profile in the original post. 2021-09-01 11:31:55
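In the tutorial above, instance provisioning goes through Ansible's ec2 module, which relies on the boto libraries to talk to AWS. As a rough illustration of what that step does under the hood, here is a boto3 sketch that launches three tagged instances; the region, AMI ID, key pair name, instance type, and tag value are placeholders rather than the article's values.

```python
import boto3

# Placeholders throughout: region, AMI, key pair, instance type, and tag value.
ec2 = boto3.resource("ec2", region_name="ap-south-1")

instances = ec2.create_instances(
    ImageId="ami-0123456789abcdef0",
    InstanceType="t2.small",
    KeyName="awskey",
    MinCount=3,
    MaxCount=3,
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "Name", "Value": "k8s-node"}],
    }],
)
for inst in instances:
    inst.wait_until_running()   # block until the instance reaches the running state
    inst.reload()               # refresh attributes so the public IP is populated
    print(inst.id, inst.public_ip_address)
```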
海外TECH DEV Community VueUse as must-have library for Vue 3 https://dev.to/harmyderoman/vueuse-as-must-have-library-for-vue-3-5o2 Argues that VueUse could become as standard for Vue projects as lodash once was for JavaScript projects: it offers simple utilities (such as mouse coordinates), integrations with Firebase, Axios, cookies, QR codes, local storage, RxJS, animation, and geolocation, extensions of the standard Vue hooks, a media player, and more, with Evan You among its sponsors and regular updates from a growing community. The post walks through three features. onClickOutside runs a callback when the user clicks outside an element passed in through a template ref; in the author's to-do app it switches an editable input back to a plain span. useStorage, combined with createGlobalState, keeps reactive state in window.localStorage under a given key so data is saved, updated, and deleted automatically and survives page reloads; components access it through a wrapper function and read the proxy's value property. useRefHistory records the change history of a ref (with deep: true for nested objects) and exposes undo, redo, canUndo, canRedo, and clear, which the author wires to Undo and Redo buttons on a note-editing page; useManualRefHistory does the same but only records when commit is called. The documentation could still be improved but is updated regularly, and the author's test project is linked at the end. 2021-09-01 11:26:15
海外TECH DEV Community How Coil supports the open-source projects we use https://dev.to/coil/how-coil-supports-the-open-source-projects-we-use-1fb1 For the past three months Coil has been developing Rafiki, an open-source all-in-one solution for Interledger wallets, which prompted the team to rethink how it supports the open-source packages it uses. The post lays out four principles: support all projects equitably, since channels like GitHub Sponsors, Open Collective, and Buy Me A Coffee fund only the single package chosen and not its dependencies; promote transparency about where donated funds actually go; fund projects that drive impact for the community and for Coil, all the way down the dependency tree; and drive sustainability for every dependency, not just the popular ones. Coil chose Flossbank as its donation mechanism because it traverses dependency trees and distributes each monthly donation at every level, it is maintenance-free (it reads the organization's GitHub repositories and package manifests such as package.json, requirements.txt, and Gemfile to decide which packages receive what share), it supports a wide range of dependencies, and it takes only a small fee, so nearly all of the donation reaches maintainers' bank accounts. In August the donation covered Coil's top-level dependencies and every package they depend on in turn, for example node-fetch and all of its dependencies. Links: flossbank.com, github.com/interledger/rafiki, tigerbeetle.com, developers.coil.com. 2021-09-01 11:02:02
Apple AppleInsider - Frontpage News Pre 'AirPods 3' release, Apple losing some traction in wireless earbud market https://appleinsider.com/articles/21/09/01/pre-airpods-3-release-apple-losing-some-traction-in-wireless-earbud-market?utm_medium=rss Apple's existing AirPods range lost market share globally, but the forthcoming AirPods 3 is expected to change that. New market research says customer interest in the TWS (total wireless stereo) market waned globally quarter over quarter, and across the US, sales grew only slightly from the previous quarter. 2021-09-01 12:00:00
Apple AppleInsider - Frontpage News 'No Chance' of blood pressure sensor for 'Apple Watch Series 7' https://appleinsider.com/articles/21/09/01/no-chance-of-blood-pressure-sensor-for-apple-watch-series-7?utm_medium=rss Following renewed rumors of a blood pressure sensor in the forthcoming Apple Watch Series 7, Mark Gurman says there is no chance the feature will debut in the fall. The new Apple Watch Series 7 is reportedly seeing manufacturing delays because of its complex new design; rumors that it would add a blood pressure sensor are now being denied by other sources. 2021-09-01 11:30:01
海外TECH Engadget Google is reportedly making its own ARM-based Chromebook processors https://www.engadget.com/google-chromebook-processor-114006740.html?src=rss In the future, Chromebooks may be powered by Google's own CPUs. According to Nikkei Asia, the company is developing processors for Chrome OS laptops and tablets in house. It is not such a far-fetched story, given that Google recently announced its own mobile chip, Tensor, slated to debut on the Pixel 6 and Pixel 6 Pro, and hired chip engineers from suppliers such as Intel and Qualcomm for that effort. Nikkei says Google was inspired by Apple's success in developing its own chips for the iPhone, iPad, and most recently Mac computers, with the first iMacs using Apple silicon arriving earlier this year. Google's in-progress Chromebook chip is reportedly based on designs from SoftBank's Arm, like most mobile processors; building the processor itself would let Google customize it, add its own features, and lessen its reliance on third-party suppliers. Nikkei reports that Google plans to ship the Chromebook processors within a few years, with the first devices powered by them arriving soon after. 2021-09-01 11:40:06
海外TECH Engadget The Morning After: Windows 11 will be available (for some) on October 5th https://www.engadget.com/the-morning-after-windows-11-will-be-available-for-some-on-october-5th-111506973.html?src=rss Microsoft has announced that Windows 11 arrives on October 5th as a free upgrade for qualifying Windows 10 systems and on new PCs shipping after that date, but the rollout is gradual: "intelligence models" weighing reliability and device age decide who is offered the upgrade first, all supported machines should get it by mid-2022, and Android app support will not be there at launch (it is planned for a Windows Insider preview build in the coming months). The newsletter also rounds up: Apple's rumored iPhone satellite support may be limited to emergency messages and crisis reporting, and may not be ready for the next iPhone; Best Buy now sells e-bikes, electric scooters, and mopeds from brands like Unagi, Bird, Segway, and SWFT online, with select US stores to follow in October; Polaroid's new Now-series connected analog camera ships with five clip-on lens filters and works with the redesigned Polaroid app; NVIDIA's RAD-TTS research lets you train a text-to-speech model with your own voice, capturing pacing, tonality, and timbre; South Korea passed a law forcing major app stores, including Google's and Apple's, to allow third-party payments; Bose's new QuietComfort headphones keep the familiar QuietComfort design while improving ANC and battery life at a lower price, launching in black and light gray on September 23rd with pre-orders open at Amazon and Bose; and Jabra introduced a low-cost, feature-packed set of Elite true wireless earbuds. Other headlines: Twitch streamers take a day off to protest hate raids, AppleToo starts publishing employees' toxic-workplace stories, Jabra promises clearer calls with its Elite Pro noise-canceling earbuds, Netgear's 5G mobile hotspot router is now available, Amazon's Echo Show deals, Samsung's Galaxy Watch gets an official walkie-talkie app, and punishing platformer Ghostrunner adds an accessibility mode. 2021-09-01 11:15:06
医療系 医療介護 CBnews Japan Medical Association president says he is "prepared for a long battle"; also issues a caution about reservations for this winter's flu vaccinations https://www.cbnews.jp/news/entry/20210901203550 中川俊男 2021-09-01 21:00:00
海外ニュース Japan Times latest articles Japan considers two-week extension of COVID-19 emergency as cases remain high https://www.japantimes.co.jp/news/2021/09/01/national/two-week-emergency-extension/ While new infections have been decreasing in some areas, the country is still struggling to contain surging cases and the strain they impose on the health care system. 2021-09-01 20:28:26
ニュース BBC News - Home Colin Pitchfork: Double child murderer released from prison https://www.bbc.co.uk/news/uk-england-leicestershire-58408210?at_medium=RSS&at_campaign=KARANGA leicestershire 2021-09-01 11:45:52
ニュース BBC News - Home Smith retains boccia title as Reid & Hewett reach wheelchair tennis last four - day seven so far https://www.bbc.co.uk/sport/disability-sport/58405624?at_medium=RSS&at_campaign=KARANGA Britain's David Smith retains his Paralympic boccia individual title with a thrilling victory in the final in Tokyo. 2021-09-01 11:05:43
LifeHuck ライフハッカー[日本版] Adds a sculpted, three-dimensional grip to your steering wheel: a car steering-wheel cover that supports safe driving https://www.lifehacker.jp/2021/09/machi-ya-zeloslip-start.html machiya 2021-09-01 21:00:00
北海道 北海道新聞 23 members of the organizing committee for Rausu's coming-of-age ceremony dined together; an infection was found among the participants; the town board of education did not ask for cancellation https://www.hokkaido-np.co.jp/article/584678/ organizing committee 2021-09-01 20:11:00
北海道 北海道新聞 Full lifting of the state of emergency seen as difficult ahead of the Sept. 12 deadline as pressure on medical care continues https://www.hokkaido-np.co.jp/article/584677/ state of emergency 2021-09-01 20:09:00
北海道 北海道新聞 Badminton's "Nagamatsu" pair look back on the Olympics, determined to win a third straight world championship title https://www.hokkaido-np.co.jp/article/584676/ 北都銀行 2021-09-01 20:07:00
北海道 北海道新聞 Moderna side effects occur more often in women, Nagasaki International University survey finds https://www.hokkaido-np.co.jp/article/584675/ Sasebo, Nagasaki Prefecture 2021-09-01 20:05:00
北海道 北海道新聞 Command center launched to drive digital reform; prime minister orders improvements to administrative procedures https://www.hokkaido-np.co.jp/article/584674/ administrative procedures 2021-09-01 20:01:00
北海道 北海道新聞 Prime minister rules out dissolving the Lower House in September; focus turns to the successor to Secretary-General Nikai https://www.hokkaido-np.co.jp/article/584673/ novel coronavirus 2021-09-01 20:01:00
ニュース Newsweek "Even so, the routine can't be skipped": men working out in the street in the middle of a hurricane https://www.newsweekjapan.jp/stories/world/2021/09/post-97017.php Hurricanes have brought devastating damage to the lives of people living along the Gulf of Mexico coast for decades, even centuries. 2021-09-01 20:35:00
IT 週刊アスキー The PS4 version of the furry open-world RPG "Biomutant" is on sale at up to 25% off! https://weekly.ascii.jp/elem/000/004/067/4067913/ playstationstore 2021-09-01 20:20:00
IT 週刊アスキー Ubisoft's fan-participation event "UBISOFT DAY 2021 ONLINE" will be held on October 3! https://weekly.ascii.jp/elem/000/004/067/4067917/ ubisoftday 2021-09-01 20:20:00
GCP Cloud Blog How to load Salesforce data into BigQuery using a code-free approach powered by Cloud Data Fusion https://cloud.google.com/blog/products/data-analytics/load-salesforce-data-to-bigquery-with-cloud-data-fusion/ Walkthrough of incrementally moving Salesforce data into BigQuery with Cloud Data Fusion, Google Cloud's fully managed, code-free data-integration service. The motivating pattern: Account, Lead, and Contact objects are frequently changed by call-center agents, and those changes need to be identified and loaded into the warehouse in batch or streaming fashion. Data Fusion ships pre-built Salesforce plugins, installable from its Hub: a Batch Single Source (one sObject, read via SOQL or by sObject name, with incremental date filters and primary-key chunking), a Batch Multi Source (several sObjects, used with multi-sinks), and a Streaming Source (tracks sObject updates), plus plugin APIs for building your own. In the sample batch incremental pipeline, the Batch Multi Source plugin's "Last Modified After/Before", "Duration", and "Offset" parameters drive the incremental logic: the pipeline reads the previous run's end time from a BigQuery checkpoint table (sf_checkpoint in a from_salesforce_cdf_staging dataset) through the BigQuery Argument Setter plugin, pulls lead, contact, and account records modified after that timestamp, writes them with the BigQuery Multi Table sink, and then advances the checkpoint; a pipeline definition file can be downloaded from GitHub and imported through the Cloud Data Fusion Studio, and the initial last_completion value controls how far back the first load reaches. The streaming pipeline instead uses the Salesforce Streaming Source plugin, which tracks changes through PushTopic events (the plugin can create the PushTopic or use an existing one, such as a CDFLeadUpdates topic capturing inserts and updates defined from the Salesforce Developer Console); no checkpoint table is needed, and the BigQuery sink table is created automatically when the first change record arrives. For hard deletes, the post suggests an audit table populated by a trigger, a separate compare-and-merge job on primary keys, a PushTopic configured for delete and undelete events, or Salesforce Change Data Capture. It closes with pointers to the Cloud Data Fusion quickstart and documentation. 2021-09-01 12:00:00
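Cloud Data Fusion implements the batch pattern above without any code; purely to make the checkpoint-driven incremental logic concrete, here is a rough Python sketch of the same idea using the simple-salesforce and google-cloud-bigquery client libraries. The credentials, the Lead field list, and the single-row checkpoint are assumptions for illustration; only the dataset and table names mirror the post's example.

```python
from google.cloud import bigquery
from simple_salesforce import Salesforce

# Illustrative only: this mirrors the checkpoint-driven incremental pattern that the
# Data Fusion pipeline implements code-free. Credentials and field list are placeholders.
bq = bigquery.Client()
sf = Salesforce(username="user@example.com", password="***", security_token="***")

# 1. Read the last completed watermark (assumes the checkpoint row already exists).
checkpoint_sql = """
    SELECT MAX(last_completion) AS last_completion
    FROM `from_salesforce_cdf_staging.sf_checkpoint`
"""
last_completion = list(bq.query(checkpoint_sql).result())[0].last_completion

# 2. Pull only the Lead records modified after the watermark.
soql = (
    "SELECT Id, FirstName, LastName, Email, LastModifiedDate "
    f"FROM Lead WHERE LastModifiedDate > {last_completion.isoformat()}"
)
records = sf.query_all(soql)["records"]

# 3. Append the changed rows to the staging table (assumes the table already exists).
if records:
    rows = [{k: v for k, v in r.items() if k != "attributes"} for r in records]
    bq.insert_rows_json("from_salesforce_cdf_staging.lead", rows)

# 4. Advance the checkpoint so the next run starts where this one ended.
bq.query(
    "UPDATE `from_salesforce_cdf_staging.sf_checkpoint` "
    "SET last_completion = CURRENT_TIMESTAMP() WHERE TRUE"
).result()
```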
