IT |
気になる、記になる… |
Rakuten Mobile's official Rakuten Ichiba store is running a campaign offering up to 12,000 yen off eligible Apple products (through November 11) |
https://taisy0.com/2021/11/04/148352.html
|
apple |
2021-11-04 06:18:16 |
TECH |
Engadget Japanese |
Docomo's Xperia 5 III also set for release on November 12 |
https://japanese.engadget.com/docomo-xperia5iii-064013411.html
|
xperia |
2021-11-04 06:40:13 |
TECH |
Engadget Japanese |
Rumor: the iPhone 14's (tentative) SoC will be made on the "N4P" process, an improved 5nm node rather than true 4nm? |
https://japanese.engadget.com/iphone14-a16-n4p-process-060057368.html
|
iphone |
2021-11-04 06:00:57 |
IT |
ITmedia 総合記事一覧 |
[ITmedia PC USER] MSI launches a 23.8-inch full-HD LCD with a 165Hz VA panel, sold exclusively on Amazon |
https://www.itmedia.co.jp/pcuser/articles/2111/04/news123.html
|
amazon |
2021-11-04 15:35:00 |
IT |
ITmedia 総合記事一覧 |
[ITmedia Business Online] Nissan releases the "Airy Gray Edition," a special Note trim bundled with popular options |
https://www.itmedia.co.jp/business/articles/2111/04/news118.html
|
airygrayedition |
2021-11-04 15:20:00 |
IT |
ITmedia 総合記事一覧 |
[ITmedia Mobile] au raises its "5G upgrade discount" for the Xperia 1 III / 5 III to 11,000 yen |
https://www.itmedia.co.jp/mobile/articles/2111/04/news120.html
|
itmediamobileau |
2021-11-04 15:19:00 |
IT |
ITmedia 総合記事一覧 |
[ITmedia Mobile] Is Rakuten Mobile's "Rakuten UN-LIMIT VI" an unconditionally good deal? A capacity-by-capacity comparison |
https://www.itmedia.co.jp/mobile/articles/2111/04/news119.html
|
itmediamobile |
2021-11-04 15:18:00 |
TECH |
Techable(テッカブル) |
Up to 80 km of range: the powerful, stylish "X20" electric-assist bicycle |
https://techable.jp/archives/165930
|
funstandard |
2021-11-04 06:00:57 |
IT |
情報システムリーダーのためのIT情報専門サイト IT Leaders |
Three House Foods group companies consolidate supply chain management, cutting food waste with AI-driven supply-demand adjustment | IT Leaders |
https://it.impress.co.jp/articles/-/22283
|
Three House Foods group companies consolidate supply chain management, cutting food waste with AI-driven supply-demand adjustment (IT Leaders): House Foods announced that the SCM (supply chain management) systems of the three group companies — House Foods, House Wellness Foods, and Sun House Foods — have been integrated. |
2021-11-04 15:38:00 |
python |
Pythonタグが付けられた新着投稿 - Qiita |
Building an analytics environment on AWS EC2 with Docker |
https://qiita.com/nessyyamamoto/items/245a2b5b16f935e87b85
|
The flow up to creating the container: what you actually work in is the container you created, but what you share with others is the Docker image or Dockerfile from which it is built. |
2021-11-04 15:25:44 |
python |
Pythonタグが付けられた新着投稿 - Qiita |
When Paramiko with get_pty=True runs a sudo command and nothing happens after entering the password |
https://qiita.com/huran0209/items/ccf14309f223ef713bdc
|
When running a sudo command via Paramiko with get_pty=True, nothing happens after entering the password. Background: I wanted to quickly check a certain config file on every host to confirm the settings were correct; the file could only be read as root, so even just cat-ing it required the sudo password, and Python seemed clearer than an expect script. Symptom: I wrote code based on kanedaq's article on connecting to a host from Python over multi-hop SSH, running a command, and capturing its stdout, but for some reason it doesn't work. |
2021-11-04 15:07:43 |
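The question above concerns running sudo over Paramiko with get_pty=True. As a minimal, hedged sketch of one common pattern (feeding the password to sudo via stdin after requesting a PTY), assuming a single-hop connection rather than the multi-hop setup in the referenced article; the host, credentials, and target file are placeholders:

```python
import paramiko

# Placeholders, not taken from the question.
HOST, USER, PASSWORD = "example.com", "user", "secret"

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(HOST, username=USER, password=PASSWORD)

# Request a PTY so sudo behaves as if interactive, and use -S so it reads
# the password from stdin; then write the password followed by a newline.
stdin, stdout, stderr = client.exec_command("sudo -S cat /etc/shadow", get_pty=True)
stdin.write(PASSWORD + "\n")
stdin.flush()

print(stdout.read().decode())
client.close()
```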
js |
JavaScriptタグが付けられた新着投稿 - Qiita |
Firebase: environment setup (Hosting) |
https://qiita.com/hirokibdd/items/191648947551eb485093
|
Firebase environment setup (Hosting): written as notes for my own study. |
2021-11-04 15:42:44 |
js |
JavaScriptタグが付けられた新着投稿 - Qiita |
[Continued] The world of Disney characters you don't know #JavaScript #HTML #CSS #codepen #Netlify |
https://qiita.com/kk_puruzera/items/cc6b3d6a3ed9bc386210
|
What I used: CodePen, Netlify, and the Disney API. References: the "Extreme Hover" CodePen and an HTML/CSS cooking-and-shopping memo; Zenn was used to record the build process. Finished code: until now I pasted it in directly, but this time I'm embedding the CodePen. |
2021-11-04 15:08:38 |
Program |
[全てのタグ]の新着質問一覧|teratail(テラテイル) |
I want to echo the keys and values of a JSON array inside a for loop |
https://teratail.com/questions/367726?rss=all
|
I want to echo the keys and values of a JSON array inside a for loop. What I'm stuck on: I have a configuration array with keys such as SPREADSHEET, GOOGLE_CLIENT…, GOOGLE_ACCESS…, and GOOGLE_REFRESH… and placeholder values (aaaaaa, bbbbb, cccccc, dddddd, eeeeee), and I want to output each key and value in a for loop. |
2021-11-04 15:58:21 |
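The question above boils down to iterating over the key/value pairs of a decoded JSON object. The original appears to involve echo (likely PHP), which is not shown here; as a language-neutral illustration, a minimal Python sketch with placeholder keys and values:

```python
import json

# Hypothetical JSON config; the keys and values are placeholders, not from the question.
raw = '{"SPREADSHEET": "aaaaaa", "GOOGLE_CLIENT": "bbbbb"}'

config = json.loads(raw)           # decode the JSON object into a dict
for key, value in config.items():  # iterate key/value pairs
    print(f"{key}={value}")        # "echo" each pair
```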
Program |
[全てのタグ]の新着質問一覧|teratail(テラテイル) |
I want to change the background of Markdown headers: how to specify the regular expression |
https://teratail.com/questions/367725?rss=all
|
I am using this to change the background color of Markdown headers. |
2021-11-04 15:54:25 |
Program |
[全てのタグ]の新着質問一覧|teratail(テラテイル) |
About an error when deploying to Heroku |
https://teratail.com/questions/367724?rss=all
|
About an error when deploying to Heroku. Background / goal: [bottom line] I am building a Rails app and want to deploy it to Heroku, but when I run `git push heroku master` the error below appears and the deploy fails. |
2021-11-04 15:53:58 |
Program |
[全てのタグ]の新着質問一覧|teratail(テラテイル) |
A question about Unity |
https://teratail.com/questions/367723?rss=all
|
|
2021-11-04 15:46:35 |
Program |
[全てのタグ]の新着質問一覧|teratail(テラテイル) |
When opening Chrome with Selenium, it disappears after processing |
https://teratail.com/questions/367722?rss=all
|
When opening Chrome with Selenium, it disappears after processing. Web-scraping environment: macOS Monterey, Jupyter Notebook, Python, Selenium, Google Chrome, chromedriver. When I launch Chrome for scraping, the browser opens at the start of the run, but once processing finishes the browser closes and the error message below appears. |
2021-11-04 15:41:35 |
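For the Selenium behavior described above (the browser closing once the script or cell finishes), one commonly suggested workaround is Chrome's "detach" option so the window survives after the driver script ends. A minimal sketch, assuming Selenium 4-style options and a chromedriver reachable on PATH; the URL is a placeholder:

```python
from selenium import webdriver
from selenium.webdriver.chrome.options import Options

options = Options()
# Keep the Chrome window open after the script (or notebook cell) finishes.
options.add_experimental_option("detach", True)

driver = webdriver.Chrome(options=options)  # assumes chromedriver is on PATH
driver.get("https://example.com")           # placeholder URL
print(driver.title)
# Without "detach" (or after driver.quit()), the browser closes when the driver
# process exits, which matches the behavior described in the question.
```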
Program |
[全てのタグ]の新着質問一覧|teratail(テラテイル) |
I want to set LoadSceneAsync to true at a timing of my choosing |
https://teratail.com/questions/367721?rss=all
|
Scene transitions with LoadScene are very slow, so I want to preload the scene with LoadSceneAsync and then switch scenes at a timing of my choosing. |
2021-11-04 15:35:26 |
Program |
[全てのタグ]の新着質問一覧|teratail(テラテイル) |
Single quotes disappear when a command is expanded with eval |
https://teratail.com/questions/367720?rss=all
|
Single quotes disappear when a command is expanded with eval. I am writing a shell script that hashes a password. |
2021-11-04 15:33:27 |
Program |
[全てのタグ]の新着質問一覧|teratail(テラテイル) |
Single quotes disappear when a command is expanded with eval |
https://teratail.com/questions/367719?rss=all
|
Single quotes disappear when a command is expanded with eval. I am writing a shell script that hashes a password. |
2021-11-04 15:28:57 |
Program |
[全てのタグ]の新着質問一覧|teratail(テラテイル) |
Omikuji (fortune drawing): about the image moving when a button is pressed |
https://teratail.com/questions/367718?rss=all
|
I made two buttons: pressing the first button draws a fortune, and the second button reloads. |
2021-11-04 15:27:35 |
Program |
[全てのタグ]の新着質問一覧|teratail(テラテイル) |
The screen flickers while the game is running in the Unity editor |
https://teratail.com/questions/367717?rss=all
|
While the game is running in the Unity editor, the screen flickers. |
2021-11-04 15:26:15 |
Program |
[全てのタグ]の新着質問一覧|teratail(テラテイル) |
Python's wave module function getnframes ends up returning 0. |
https://teratail.com/questions/367716?rss=all
|
Background / goal: I want to get the number of samples in WAV data using Python's wave module, but the returned value comes out as 0. |
2021-11-04 15:25:23 |
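Since the question above is about wave.getnframes() returning 0, here is a minimal sketch of the intended usage of the standard-library wave module; the file name is a placeholder, and the note about unclosed writers is an assumption about a common cause, not something stated in the question:

```python
import wave

# Placeholder path; replace with the actual WAV file.
with wave.open("sample.wav", "rb") as wf:
    print("channels:", wf.getnchannels())
    print("sample rate:", wf.getframerate())
    print("frames:", wf.getnframes())  # audio frames (samples per channel)
    print("duration (s):", wf.getnframes() / wf.getframerate())

# A frame count of 0 can happen when the file's header was never finalized,
# e.g. a Wave_write object that was not closed before reading the file back.
```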
Program |
[全てのタグ]の新着質問一覧|teratail(テラテイル) |
When using flutter_hooks, a "'_element!.dirty': Bad state" error occasionally occurs. |
https://teratail.com/questions/367715?rss=all
|
I am writing a process that, when a book title is typed into a TextField, fetches candidate book data from an API, maps it to models, and then lists the book names. |
2021-11-04 15:24:16 |
Program |
[全てのタグ]の新着質問一覧|teratail(テラテイル) |
I want to filter posts by category on a WordPress custom post type archive page |
https://teratail.com/questions/367714?rss=all
|
On the archive page of a WordPress custom post type, I want to create buttons that filter posts by category. |
2021-11-04 15:18:48 |
Program |
[全てのタグ]の新着質問一覧|teratail(テラテイル) |
I can't get the "like" feature to work asynchronously and I'm stuck. |
https://teratail.com/questions/367713?rss=all
|
Background / goal: implementing an asynchronous "like" feature for a posting application. |
2021-11-04 15:01:47 |
Program |
[全てのタグ]の新着質問一覧|teratail(テラテイル) |
I want to eliminate errors when changing the internal map from Map<I,O> to Map<I,Myfuture<O>> |
https://teratail.com/questions/367712?rss=all
|
I want to eliminate errors when changing the internal map from Map<I,O> to Map<I,Myfuture<O>>. As shown below, I modified the code to use Map<I,Myfuture<O>> instead of Map<I,O>, and ran a program that prints the execution time before and after the change in Main to confirm whether it is actually an improvement. |
2021-11-04 15:01:18 |
Ruby |
Rubyタグが付けられた新着投稿 - Qiita |
Ruby instance variables |
https://qiita.com/TatsuyaIsamu/items/6f7f89a5165983887b6a
|
Ruby instance variables — something we use without thinking in Rails — reconsidered through the vending-machine program above. The VendingMachine class has a frozen MONEY constant, an initialize that sets the @slot_money instance variable, a current_slot_money reader that returns @slot_money, a slot_money(money) method that returns false unless MONEY.include?(money) and otherwise updates @slot_money, and a return_money method that puts @slot_money. After writing a vending-machine program like this and loading it in irb with require (which returns true), VendingMachine.new returns an instance as expected, but outputting @slot_money directly at the irb top level gives nil. |
2021-11-04 15:59:21 |
Ruby |
Rubyタグが付けられた新着投稿 - Qiita |
PG::ConnectionBad error |
https://qiita.com/TatsuyaIsamu/items/8df277ba2fd9450e4c4b
|
…services restart postgresql |
2021-11-04 15:14:56 |
Ruby |
Rubyタグが付けられた新着投稿 - Qiita |
Introducing Google Maps |
https://qiita.com/TatsuyaIsamu/items/05c84bb1ace6f2d4b39c
|
Introducing Google Maps: adding Google Maps to a Ruby on Rails app requires Google's APIs and Ruby gems; these are notes on the key points. Development environment: Ruby, Ruby on Rails. APIs provided by Google: you need to obtain both the Google Maps API and the geocoding API — the former is for using the map itself, the latter derives a destination from latitude and longitude (and the reverse). Following the referenced articles you can build a map like this; the point to remember is that the map and the search feature themselves can be added without any gem. Gems provided by Ruby: the geocoder gem and the gmaps4rails gem. geocoder links the Google API with your DB — Google Maps works without it, but it is needed to store map information and to update the map via callbacks when an address is added. gmaps4rails simply makes it easy to display a Google Map — you could do it without the gem if you tried hard enough, but gmaps4rails makes the setup easy. The primary sources and the setup articles I referenced are listed below; following them, the map updates whenever a record is saved to the DB. Summary: there is a great deal of information about adding maps, but also many articles whose accuracy is questionable. |
2021-11-04 15:10:36 |
Ruby |
Rubyタグが付けられた新着投稿 - Qiita |
Scraping: collecting data while navigating between pages |
https://qiita.com/TatsuyaIsamu/items/bc0b35562ba74ff4b705
|
|
2021-11-04 15:02:40 |
Docker |
dockerタグが付けられた新着投稿 - Qiita |
Building an analytics environment on AWS EC2 with Docker |
https://qiita.com/nessyyamamoto/items/245a2b5b16f935e87b85
|
The flow up to creating the container: what you actually work in is the container you created, but what you share with others is the Docker image or Dockerfile from which it is built. |
2021-11-04 15:25:44 |
Ruby |
Railsタグが付けられた新着投稿 - Qiita |
Ruby instance variables |
https://qiita.com/TatsuyaIsamu/items/6f7f89a5165983887b6a
|
Ruby instance variables — something we use without thinking in Rails — reconsidered through the vending-machine program above. The VendingMachine class has a frozen MONEY constant, an initialize that sets the @slot_money instance variable, a current_slot_money reader that returns @slot_money, a slot_money(money) method that returns false unless MONEY.include?(money) and otherwise updates @slot_money, and a return_money method that puts @slot_money. After writing a vending-machine program like this and loading it in irb with require (which returns true), VendingMachine.new returns an instance as expected, but outputting @slot_money directly at the irb top level gives nil. |
2021-11-04 15:59:21 |
Ruby |
Railsタグが付けられた新着投稿 - Qiita |
PG::ConnectionBad error |
https://qiita.com/TatsuyaIsamu/items/8df277ba2fd9450e4c4b
|
…services restart postgresql |
2021-11-04 15:14:56 |
Ruby |
Railsタグが付けられた新着投稿 - Qiita |
Introducing Google Maps |
https://qiita.com/TatsuyaIsamu/items/05c84bb1ace6f2d4b39c
|
Introducing Google Maps: adding Google Maps to a Ruby on Rails app requires Google's APIs and Ruby gems; these are notes on the key points. Development environment: Ruby, Ruby on Rails. APIs provided by Google: you need to obtain both the Google Maps API and the geocoding API — the former is for using the map itself, the latter derives a destination from latitude and longitude (and the reverse). Following the referenced articles you can build a map like this; the point to remember is that the map and the search feature themselves can be added without any gem. Gems provided by Ruby: the geocoder gem and the gmaps4rails gem. geocoder links the Google API with your DB — Google Maps works without it, but it is needed to store map information and to update the map via callbacks when an address is added. gmaps4rails simply makes it easy to display a Google Map — you could do it without the gem if you tried hard enough, but gmaps4rails makes the setup easy. The primary sources and the setup articles I referenced are listed below; following them, the map updates whenever a record is saved to the DB. Summary: there is a great deal of information about adding maps, but also many articles whose accuracy is questionable. |
2021-11-04 15:10:36 |
Tech Blog |
Developers.IO |
I tried using Aqua to check Lambda programs for vulnerabilities and secrets (access keys and private keys) |
https://dev.classmethod.jp/articles/aqua-function-setup/
|
aquaplatform |
2021-11-04 06:06:49 |
Overseas TECH |
DEV Community |
Big Data Analytics Options on AWS | AWS White Paper Summary |
https://dev.to/awsmenacommunity/big-data-analytics-options-on-aws-aws-white-paper-summary-59l2
|
Big Data Analytics Options on AWS | AWS White Paper Summary. Introduction: As the world becomes more digital, the amount of data created and collected constantly grows and accelerates, and analyzing this ever-growing data becomes a challenge with traditional analytical tools. Big data tools and technologies offer opportunities to analyze data efficiently so you can better understand customer preferences, gain a competitive advantage in the marketplace, and grow your business. AWS provides a broad platform of managed services to help you build, secure, and seamlessly scale end-to-end big data applications quickly and with ease.
The AWS advantage in big data analytics: Analyzing large datasets requires significant compute capacity that varies with the amount of input data and the type of analysis. For mission-critical applications on more traditional infrastructure, system designers have no choice but to over-provision, because the system must handle surges in data driven by growing business needs. On AWS you can provision more capacity and compute in minutes, so your big data applications grow and shrink as demand dictates and run as close to optimal efficiency as possible. In addition, you get flexible computing on a global infrastructure, with access to the many geographic Regions AWS offers and the ability to use other scalable services to build sophisticated big data applications. These other services include Amazon Simple Storage Service (Amazon S3) to store data, AWS Glue to orchestrate jobs that move and transform data easily, and AWS IoT, which lets connected devices interact with cloud applications and other connected devices. As the amount of data being generated continues to grow, AWS has many options to get that data to the cloud, including secure devices like the AWS Snow Family to accelerate petabyte-scale data transfers, delivery streams with Amazon Kinesis Data Firehose to load streaming data continuously, database migration using AWS Database Migration Service, and scalable private connections through AWS Direct Connect. As mobile usage continues to grow rapidly, you can use the suite of services within the AWS Mobile Hub to collect and measure app usage and data, or export that data to another service for further custom analysis. These capabilities make AWS an ideal fit for solving big data problems, and many customers have implemented successful big data analytics workloads on AWS. The following services for collecting, processing, storing, and analyzing big data are described in order.
Amazon Kinesis: Amazon Kinesis is a platform for streaming data on AWS that makes it easy to load and analyze streaming data and to build custom streaming data applications for specialized needs. With Kinesis you can ingest real-time data such as application logs, website clickstreams, and Internet of Things (IoT) telemetry into your databases, data lakes, and data warehouses, or build your own real-time applications using this data. Kinesis lets you process and analyze data as it arrives and respond in real time, instead of waiting until all your data is collected before processing can begin. There are currently four pieces of the Kinesis platform that can be used depending on your use case: Amazon Kinesis Data Streams lets you build custom applications that process or analyze streaming data; Amazon Kinesis Video Streams lets you build custom applications that process or analyze streaming video; Amazon Kinesis Data Firehose lets you deliver real-time streaming data to AWS destinations such as Amazon S3, Amazon Redshift, OpenSearch Service, and Splunk; and Amazon Kinesis Data Analytics lets you process and analyze streaming data with standard SQL or with Java (managed Apache Flink).
Amazon Managed Streaming for Apache Kafka (Amazon MSK): Amazon MSK is a fully managed service that makes it easy to build and run applications that use Apache Kafka to process streaming data. Apache Kafka is an open-source platform for building real-time streaming data pipelines and applications. With Amazon MSK you can use native Apache Kafka APIs to populate data lakes, stream changes to and from databases, and power machine learning and analytics applications.
AWS Lambda: AWS Lambda lets you run code without provisioning or managing servers. You pay only for the compute time you consume; there is no charge when your code is not running. With Lambda you can run code for virtually any type of application or backend service, all with zero administration. Just upload your code, and Lambda takes care of everything required to run and scale it with high availability. You can set up your code to trigger automatically from other AWS services or call it directly from any web or mobile app.
Amazon Elastic MapReduce (Amazon EMR): Amazon EMR is the industry-leading cloud big data platform for processing vast amounts of data using open-source tools such as Apache Spark, Apache Hive, Apache HBase, Apache Flink, Apache Hudi, and Presto. EMR makes it easy to set up, operate, and scale big data environments by automating time-consuming tasks like provisioning capacity and tuning clusters. With EMR you can run petabyte-scale analysis at less than half the cost of traditional on-premises solutions, and faster than standard Apache Spark. EMR does all the work involved with provisioning, managing, and maintaining the infrastructure and software of a Hadoop cluster.
AWS Glue: AWS Glue is a serverless data integration service that makes it easy to discover, prepare, and combine data for analytics, machine learning, and application development. It provides all of the capabilities needed for data integration, with both visual and code-based interfaces to make data integration easier.
AWS Lake Formation: AWS Lake Formation is an integrated data lake service that makes it easy to ingest, clean, catalog, transform, and secure your data and make it available for analysis and machine learning. Lake Formation gives you a central console where you can discover data sources, set up transformation jobs to move data to an S3 data lake, remove duplicates and match records, catalog data for access by analytic tools, configure data access and security policies, and audit and control access from AWS analytic and machine learning services.
Amazon Machine Learning: AWS offers a broad and deep set of machine learning services and supporting cloud infrastructure, putting machine learning in the hands of every developer, data scientist, and expert practitioner. When you build an ML-based workload on AWS, you can choose from three levels of ML services to balance speed to market against the level of customization and ML skill required: Artificial Intelligence (AI) services, ML services, and ML frameworks and infrastructure.
Amazon DynamoDB: Amazon DynamoDB is a fast, fully managed NoSQL database service that makes it simple and cost-effective to store and retrieve any amount of data and serve any level of request traffic. DynamoDB helps offload the administrative burden of operating and scaling a highly available distributed database cluster. It meets the latency and throughput requirements of highly demanding applications by providing single-digit-millisecond latency and predictable performance with seamless throughput and storage scalability.
Amazon Redshift: Amazon Redshift is a fast, fully managed, petabyte-scale data warehouse service that makes it simple and cost-effective to analyze all your data efficiently using your existing business intelligence tools. It is optimized for datasets ranging from a few hundred gigabytes to a petabyte or more and is designed to cost less than a tenth of most traditional data warehousing solutions.
Amazon OpenSearch Service: Amazon OpenSearch Service makes it easy to deploy, operate, and scale OpenSearch for log analytics, full-text search, application monitoring, and more. It is a fully managed service that delivers OpenSearch's easy-to-use APIs and real-time capabilities along with the availability, scalability, and security required by production workloads. The service offers built-in integrations with OpenSearch Dashboards, Logstash, and AWS services including Amazon Kinesis Data Firehose, AWS Lambda, and Amazon CloudWatch, so you can go from raw data to actionable insights quickly.
Amazon QuickSight: Amazon QuickSight is a scalable, serverless, embeddable, machine-learning-powered business intelligence (BI) service built for the cloud. It makes it easy for all employees within an organization to build visualizations, perform ad hoc analysis, and quickly get business insights from their data, anytime, on any device. QuickSight lets organizations scale their business analytics to hundreds of thousands of users and delivers fast, responsive query performance through a robust in-memory engine (SPICE).
Amazon compute services: Amazon Elastic Compute Cloud (Amazon EC2) instances, Amazon Elastic Container Service (Amazon ECS), and Amazon Elastic Kubernetes Service (Amazon EKS) are available for self-managed big data applications. Amazon EC2, with instances acting as AWS virtual machines, provides an ideal platform for operating your own self-managed big data analytics applications on AWS infrastructure. Almost any software you can install on Linux or Windows virtualized environments can run on Amazon EC2, with a pay-as-you-go pricing model.
Amazon Athena: Amazon Athena is an interactive query service that makes it easy to analyze data in Amazon S3 using standard SQL. Athena is serverless, so there is no infrastructure to set up or manage, and you can start analyzing data immediately. You don't need to load your data into Athena; it works directly with data stored in S3. Just log in to the Athena console, define your table schema, and start querying. Athena uses Presto with full ANSI SQL support and works with a variety of standard data formats, including CSV, JSON, ORC, Apache Parquet, and Apache Avro.
Solving big data problems on AWS — Example: queries against an Amazon S3 data lake. Data lakes are an increasingly popular way to store and analyze both structured and unstructured data. If you use an Amazon S3 data lake, AWS Glue can make all your data immediately available for analytics without moving it. AWS Glue crawlers can scan your data lake and keep the AWS Glue Data Catalog in sync with the underlying data. You can then query your data lake directly with Amazon Athena and Amazon Redshift Spectrum, and you can also use the AWS Glue Data Catalog as your external Apache Hive Metastore for big data applications running on Amazon EMR.
Example: capturing and analyzing sensor data. The process begins with each A/C unit providing a constant data stream to Amazon Kinesis Data Streams. Using the tools provided with Kinesis Data Streams, such as the Kinesis Client Library or SDK, a simple application is built on Amazon EC2 to read data as it arrives, analyze it, and determine whether it warrants an update to the real-time dashboard. This data flow needs to occur in near real time so that customers and maintenance teams can be alerted quickly if there is an issue with a unit. There will also be many potential consumers of this data: customers checking on their system via a mobile device or browser; maintenance teams checking the status of the fleet; and algorithms and analytics in the reporting platform that spot trends which can be sent out as alerts, such as when the A/C fan has been running unusually long without the building temperature going down. DynamoDB was chosen to store this near-real-time dataset because it is highly available and scalable; throughput can easily be scaled up or down to meet consumer needs as the platform is adopted and usage grows. The reporting dashboard is a custom web application built on top of this dataset and run on Amazon EC2; it presents system status and trends and alerts customers and maintenance crews of any issues with a unit. The customer accesses the data from a mobile device or web browser to get the current status and visualize historical trends. To read from the Kinesis stream, a separate Kinesis-enabled application, probably running on a smaller EC2 instance that scales more slowly, transforms the data into a format suitable for long-term storage, for loading into the data warehouse, and for storing on Amazon S3. Amazon Redshift is again used as the data warehouse for the larger dataset. For visualizing the analytics, one of the many partner visualization platforms can be used via the ODBC/JDBC connection to Amazon Redshift.
Example: sentiment analysis of social media. First, deploy an Amazon EC2 instance in an Amazon VPC that ingests tweets from Twitter. Next, create an Amazon Kinesis Data Firehose delivery stream that loads the streaming tweets into the raw prefix of the solution's S3 bucket. S3 invokes a Lambda function to analyze the raw tweets, using Amazon Translate to translate non-English tweets into English and Amazon Comprehend to perform entity extraction and sentiment analysis with natural language processing (NLP). A second Kinesis Data Firehose delivery stream loads the translated tweets and sentiment values into the sentiment prefix of the S3 bucket, and a third delivery stream loads entities into the entities prefix. This architecture also deploys a data lake that includes AWS Glue for data transformation, Amazon Athena for data analysis, and Amazon QuickSight for data visualization. The AWS Glue Data Catalog contains a logical database used to organize the tables for the data in S3; Athena uses these table definitions to query the data stored in S3 and return the information to an Amazon QuickSight dashboard.
Conclusion: As more and more data is generated and collected, organizations face a growing big data ecosystem where new tools emerge and become outdated very quickly. With a broad set of managed services to collect, process, and analyze big data, AWS makes it easier to build, deploy, and scale big data applications, letting you focus on business problems instead of updating and managing these tools. AWS provides many solutions to address big data analytic requirements, and most big data architectures use multiple AWS tools to build a complete solution. The result is a flexible big data architecture that can scale along with your business. Reference: original paper. |
2021-11-04 06:36:03 |
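The summary above describes Amazon Kinesis as the ingestion point for streaming records such as IoT telemetry. As a concrete, hypothetical illustration, a minimal boto3 sketch that writes one sensor reading to a Kinesis data stream; the stream name, region, and payload are placeholders, and the stream is assumed to already exist:

```python
import json
import boto3

# Placeholders: the region and stream name are assumptions, not from the paper.
kinesis = boto3.client("kinesis", region_name="us-east-1")

record = {"unit_id": "ac-unit-001", "temperature_c": 22.5}

kinesis.put_record(
    StreamName="sensor-telemetry",            # must already exist
    Data=json.dumps(record).encode("utf-8"),  # payload bytes
    PartitionKey=record["unit_id"],           # controls shard assignment
)
```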
Overseas TECH |
DEV Community |
Writing your first Client-Server Program |
https://dev.to/iabdsam/writing-your-first-client-server-program-5a0b
|
Writing your first Client-Server Program. Your professor asked you to prepare a basic client-server program for the lab next week, or maybe you are starting socket programming on your own — what's better than writing the first program and learning along the way? The source files are on GitHub if you need them.
The quick theory: what is a socket? It's an integer, a file descriptor, through which we do our desired communication. What does communication look like (important)? Server: bind()s a socket to an address and port; this socket then listen()s for incoming requests, then accept()s the incoming request from a client. Client: connect()s its socket, via a local address and port, to a listen()ing socket for it to accept. What happens after connect and accept? accept() returns a new socket to be used for communicating with that particular accepted client; on the client side, the same socket used to connect is used to communicate. Now you can send() and recv() from both sides.
The first thing you need is headers; we will be using #include <stdio.h>, <string.h>, <unistd.h>, <sys/socket.h>, and <arpa/inet.h>. Let's build both sides one by one.
server.c: in main(), create the socket with socket(PF_INET, SOCK_STREAM, ...) — a stream socket in the PF_INET (internet) domain. Now we have to bind our socket to an address and port, but bind() asks for a struct of type sockaddr that holds all that info; instead we fill a sockaddr_in, which is more specific to our needs, and cast it to sockaddr: set addrPort.sin_family = AF_INET (IPv4), addrPort.sin_addr.s_addr = htonl(INADDR_ANY), and addrPort.sin_port = htons(...) — htonl is host-to-network long, htons is host-to-network short. INADDR_ANY automatically fills in a default IP address, since we don't want to bind to a particular one; for the port you can use your own, just make sure it is not reserved and use the same one in the client. Call bind(sockID, (struct sockaddr *)&addrPort, sizeof(addrPort)) and print "Bind Failed" or "Bind Success" depending on the return value (it returns -1 on failure). Then call listen(sockID, ...) and print "Listen Failed" or "Listening". Now we are ready to accept an incoming connect from the client: declare struct sockaddr_in client_addr and socklen_t len = sizeof(client_addr), then call accept(sockID, (struct sockaddr *)&client_addr, &len) and print "Accepted a connection". This is interesting: the only thing accept needs from our server is sockID; client_addr is filled with info about the accepted client on the function call, and accept returns a new socket that will be used for sending to and receiving from this client. The last step is the chat loop: with a char msg buffer and counters countR and countS, loop while msg is not "Close\n" — print "To Client", fgets() a message from stdin, send() it on the accepted socket, break if it was "Close\n", then print "From Client", recv() into msg, and print it. Try to understand this yourself: countR and countS are there for error handling, since send and recv return -1 on error, and sending or receiving the word "Close" is the exit condition. The idea: the first message is sent by the server, the client receives it and sends something back, the server receives that and sends again, and so on. Finally, close() both the accepted socket and the listening socket, print that the socket connection is closed, and return.
Without scratching heads, let's move to client.c: in main(), create the socket with socket(PF_INET, SOCK_STREAM, ...) and fill the same sockaddr_in (sin_family = AF_INET, sin_addr.s_addr = htonl(INADDR_ANY), sin_port = htons(...)), because connecting also requires the specifics of where and via what to connect. Call connect(sockID, (struct sockaddr *)&addrPort, sizeof(addrPort)) and print "Connect Failed" or "Connected via Port". Nearly identical to bind, right? But remember it complements the accept() on the server side. The chat loop mirrors the server's: loop while msg is not "Close\n" — print "From Server", recv() into msg and print it, break if it was "Close\n", then print "To Server", fgets() a message and send() it. I hope it's easier to see the logic now. Finally, close() the socket, print that the socket connection is closed, and return.
Done! Move to the directory, split your terminal, run the server in one pane and then the client in the other. Happy connection! Source code: I have tried to keep it as short as possible and only explain the things relevant to a first program. Questions and suggestions are welcome. |
2021-11-04 06:32:28 |
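The article above walks through bind/listen/accept on the server and connect on the client in C. As a compact illustration of the same flow (not the article's code), here is a minimal Python sketch; the port is a placeholder:

```python
import socket

PORT = 5050  # placeholder; any free, non-reserved port works


def server() -> None:
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("", PORT))        # bind to all local addresses on PORT
    srv.listen(1)               # wait for an incoming connection
    conn, addr = srv.accept()   # accept() returns a NEW socket for this client
    conn.sendall(b"hello from server\n")
    print("from client:", conn.recv(1024).decode())
    conn.close()
    srv.close()


def client() -> None:
    cli = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    cli.connect(("127.0.0.1", PORT))  # the same socket then talks to the server
    print("from server:", cli.recv(1024).decode())
    cli.sendall(b"hello from client\n")
    cli.close()
```

Run server() in one terminal and client() in another, mirroring the split-terminal step at the end of the article.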
Overseas TECH |
DEV Community |
JavaScript Loose Equality vs Strict Equality check |
https://dev.to/swastikyadav/javascript-loose-equality-vs-strict-equality-check-5k2
|
JavaScript Loose Equality vs Strict Equality check. Hello everyone! In this post we will explore the difference between JS loose equality (==) and strict equality (===). Here is the simplest definition: loose equality checks the value only, while strict equality checks the value as well as the data type. But wait, there is something more to it; let's understand how each of them works, one by one. Strict equality (===): strict equality first checks the data type; if the data types are the same it compares the values, otherwise it returns false. Example: comparing a number with the same value written as a string logs false, because the data types differ even though the values are the same. Loose equality (==): loose equality works like strict equality, except that when the data types differ it performs an implicit type conversion and then compares the values. Example: the same comparison logs true, because implicit conversion turns the string into a number before comparing. If you enjoyed or found this post helpful, please consider joining my weekly newsletter below. Thank you for reading! I am starting a newsletter where I will share epic content on building your skill set, so if this sounds interesting to you, subscribe here. |
2021-11-04 06:10:40 |
Overseas TECH |
Engadget |
HBO Max and Discovery+ might merge into a single platform |
https://www.engadget.com/hbo-max-discovery-merge-single-platform-061538436.html?src=rss
|
HBO Max and Discovery+ might merge into a single platform. Back in May, AT&T spun off its WarnerMedia division and merged it with Discovery in a multibillion-dollar agreement. The deal is on track to close by mid-2022, after which we may see its streaming services become a combined offering to subscribers. According to Gizmodo, JB Perrette, president and CEO of Discovery Streaming and International, has discussed the steps the company may take to reach that goal. Initially, Warner Bros. Discovery, as the merged company will be called, may offer HBO Max and Discovery+ as a bundle; in the next phase of the plan, the company may merge the two streaming services into one platform. FierceVideo said Perrette described both streaming services as an "incredibly attractive tech buffet." He expects the new company to take the best parts from both to create a new platform, as there will be "meaningful cost savings" and "meaningful consumer benefits" from combining the two services. Discovery president and CEO David Zaslav also revealed during the earnings call that less than half of Discovery+ subscribers in the US are also subscribed to HBO Max. He said that with the right packaging, the fact that the overlap in subscribers isn't too big "provides a real opportunity to broaden the base of the combined offering." When and where the joint HBO Max and Discovery+ service will initially be available remains to be seen. Discovery said it might be easier to combine its streaming service with HBO Max in regions where Discovery+ isn't available yet; at the moment it is being very selective about rolling out Discovery+ to new markets, to minimize the need to re-platform two streaming services in the future. |
2021-11-04 06:15:38 |
Medical |
医療介護 CBnews |
Tackling polypharmacy through regional collaboration, with a model project to run in FY2022 - MHLW |
https://www.cbnews.jp/news/entry/20211104153220
|
厚生労働省 |
2021-11-04 15:45:00 |
Medical |
医療介護 CBnews |
FY2020 long-term care insurance costs hit a record high of 10.7783 trillion yen - MHLW statistics on long-term care benefit expenditures |
https://www.cbnews.jp/news/entry/20211104144746
|
介護予防 |
2021-11-04 15:15:00 |
Finance |
JPX マーケットニュース |
[TSE] Approval of new listing (Mothers): Flect Co., Ltd. |
https://www.jpx.co.jp/listing/stocks/new/index.html
|
新規上場 |
2021-11-04 15:30:00 |
Finance |
JPX マーケットニュース |
[TSE] Continued expansion of daily price limits: 1 issue |
https://www.jpx.co.jp/news/1030/20211104-02.html
|
継続 |
2021-11-04 15:15:00 |
Finance |
JPX マーケットニュース |
[TSE] Expansion of daily price limits: 1 issue |
https://www.jpx.co.jp/news/1030/20211104-01.html
|
東証 |
2021-11-04 15:15:00 |
News |
ジェトロ ビジネスニュース(通商弘報) |
President Putin announces maximum forest conservation measures |
https://www.jetro.go.jp/biznews/2021/11/9f747bfb9e3c13a7.html
|
森林 |
2021-11-04 06:50:00 |
News |
ジェトロ ビジネスニュース(通商弘報) |
US EPA announces proposed rules to cut methane emissions from the oil and natural gas industry |
https://www.jetro.go.jp/biznews/2021/11/b219bb5887c09acb.html
|
天然ガス |
2021-11-04 06:10:00 |
News |
BBC News - Home |
Musket balls unearthed in digs at Culloden Battlefield |
https://www.bbc.co.uk/news/uk-scotland-highlands-islands-59133560?at_medium=RSS&at_campaign=KARANGA
|
battlefield |
2021-11-04 06:07:51 |
News |
BBC News - Home |
Chelsea 5-0 Everton: Was this peak Antonio Conte in his time in the Premier League? |
https://www.bbc.co.uk/sport/av/football/59154237?at_medium=RSS&at_campaign=KARANGA
|
Chelsea 5-0 Everton: Was this peak Antonio Conte in his time in the Premier League? Watch a performance described by commentator John Motson as one of the best he had ever seen, as Antonio Conte's Chelsea destroy Everton in their Premier League title-winning season. |
2021-11-04 06:23:37 |
News |
BBC News - Home |
Climate change: Olympic snowboarding champion Shaun White discusses global warming |
https://www.bbc.co.uk/sport/av/winter-sports/59096039?at_medium=RSS&at_campaign=KARANGA
|
Climate change: Olympic snowboarding champion Shaun White discusses global warming. Three-time Olympic snowboarding champion Shaun White says it will be interesting to see what happens to winter sports if global warming continues. |
2021-11-04 06:15:41 |
Business |
ダイヤモンド・オンライン - 新着記事 |
[Contributed] Staying human in the age of AI, by Kissinger and others - from the WSJ |
https://diamond.jp/articles/-/286690
|
時代 |
2021-11-04 15:12:00 |
LifeHack |
ライフハッカー[日本版] |
How to collaborate on iPhone and iPad using Apple Notes |
https://www.lifehacker.jp/2021/11/how-to-collaborate-with-apple-notes.html
|
apple |
2021-11-04 16:00:00 |
Hokkaido |
北海道新聞 |
Tatsuo Okada to become manager at his alma mater Toyo Univ. Himeji from April next year |
https://www.hokkaido-np.co.jp/article/607867/
|
岡田龍生 |
2021-11-04 15:11:00 |
Hokkaido |
北海道新聞 |
Restaurant restrictions lifted: keep infection-prevention awareness high |
https://www.hokkaido-np.co.jp/article/607665/
|
新型コロナウイルス |
2021-11-04 15:06:01 |
Hokkaido |
北海道新聞 |
Prime Minister Kishida's first overseas trip: prioritize dialogue to ease tensions |
https://www.hokkaido-np.co.jp/article/607664/
|
岸田文雄 |
2021-11-04 15:05:35 |
Hokkaido |
北海道新聞 |
Lotte's Roki Sasaki to start Game 1 of the Climax Series, with Kazuya Ojima in Game 2 |
https://www.hokkaido-np.co.jp/article/607866/
|
井口資仁 |
2021-11-04 15:01:00 |
IT |
週刊アスキー |
Domino's much-talked-about "Pizza Rice Bowl": second one for 200 yen in a lunchtime multi-buy campaign |
https://weekly.ascii.jp/elem/000/004/074/4074083/
|
利用可能 |
2021-11-04 15:30:00 |
IT |
週刊アスキー |
"Lineage 2M" holds "Dungeon Plus Time!", doubling the base dungeon time |
https://weekly.ascii.jp/elem/000/004/074/4074127/
|
ncsoft |
2021-11-04 15:15:00 |
IT |
週刊アスキー |
Autumn chestnuts take center stage: Yokohama Bay Hotel Tokyu's "Chestnut Afternoon Tea," through November 30 |
https://weekly.ascii.jp/elem/000/004/074/4074121/
|
期間限定 |
2021-11-04 15:10:00 |
Marketing |
AdverTimes |
Request for cooperation with the annual year-end "Survey on Corporate PR Activities 2021" / from the monthly Kouhou Kaigi |
https://www.advertimes.com/20211104/article367738/
|
事業会社 |
2021-11-04 06:10:33 |