Week Ending 12.27.2020
RESEARCH WATCH: 12.27.2020
Over the past week, 851 new papers were published in "Computer Science".
The paper discussed most in the news over the past week was by a team at Ben-Gurion University of the Negev: "AIR-FI: Generating Covert Wi-Fi Signals from Air-Gapped Computers" by Mordechai Guri (Dec 2020), which was referenced 15 times, including in the article RAM used as a Wi-Fi transmitter to leak data from air-gapped systems in TechSpot. The paper got social media traction with 594 shares. The investigators show that attackers can exfiltrate data from air-gapped computers via Wi-Fi signals. A Twitter user, @SimonZerafa, observed "This Week in Exfil. Using DDR SDRAM buses to generate electromagnetic emissions in the 2.4 GHz Wi-Fi range ->".
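To make the transmitter concept concrete, the sketch below shows only the on-off-keying schedule such a covert channel would follow: bursts of memory traffic for a '1' bit and idling for a '0' bit. It is a conceptual illustration, not the paper's implementation, which relies on carefully timed low-level DDR memory transfers in native code; the bit period, buffer size, and payload here are made-up values.

```python
# Conceptual illustration of an on-off-keyed covert channel driven by memory activity.
# AIR-FI itself uses precisely timed low-level DDR SDRAM transfers in native code so
# that bus activity emits RF near 2.4 GHz; plain Python cannot reproduce that, so this
# sketch only shows the bit-level framing a transmitter of this kind would follow.
import time

BIT_PERIOD = 0.1        # seconds per bit (illustrative value, not from the paper)
BUFFER_SIZE = 1 << 20   # 1 MiB buffer rewritten during a '1' bit

def transmit(bits):
    buf = bytearray(BUFFER_SIZE)
    filler = bytes(BUFFER_SIZE)
    for bit in bits:
        deadline = time.perf_counter() + BIT_PERIOD
        if bit == "1":
            # Keep the memory bus busy for the whole bit period.
            while time.perf_counter() < deadline:
                buf[:] = filler
        else:
            # Stay quiet for a '0' bit.
            time.sleep(max(0.0, deadline - time.perf_counter()))

transmit("10110010")  # hypothetical payload, framed bit by bit
```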
Leading researcher Oriol Vinyals (DeepMind) published "Solving Mixed Integer Programs Using Neural Networks", which had 24 shares over the past 2 days. The researchers apply learning to the two key sub-tasks of a MIP solver: generating a high-quality joint variable assignment, and bounding the gap in objective value between that assignment and an optimal one. @popular_ML tweeted "The most popular ArXiv tweet in the last 24h".
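The division of labor the authors describe can be illustrated with a toy example. The sketch below is illustrative only: the paper couples learned models with a full branch-and-bound MIP solver on large instances, whereas here a tiny binary program, exhaustive enumeration, and a hard-coded "prediction" stand in for the solver and the neural networks, and the gap is measured against the enumerated optimum rather than a learned bound.

```python
# Illustrative sketch: a hypothetical "neural" prediction fixes part of a binary MIP's
# variables, a stand-in solver completes the assignment, and the optimality gap of the
# resulting solution is measured against the true optimum.
from itertools import product

# maximize c.x  subject to  A.x <= b,  x in {0,1}^n
c = [5, 4, 3, 7]
A = [[2, 3, 1, 4],
     [3, 1, 4, 2]]
b = [7, 8]

def feasible(x):
    return all(sum(a * xi for a, xi in zip(row, x)) <= rhs
               for row, rhs in zip(A, b))

def best(fixed):
    """Enumerate completions of a partial assignment; return the best feasible one."""
    free = [i for i in range(len(c)) if i not in fixed]
    best_val, best_x = float("-inf"), None
    for bits in product([0, 1], repeat=len(free)):
        x = [fixed.get(i) for i in range(len(c))]
        for i, v in zip(free, bits):
            x[i] = v
        if feasible(x):
            val = sum(ci * xi for ci, xi in zip(c, x))
            if val > best_val:
                best_val, best_x = val, x
    return best_val, best_x

# Hypothetical (deliberately imperfect) prediction: fix two variables to 1, then let
# the stand-in solver optimize only the remaining ones.
predicted_fixing = {2: 1, 3: 1}
heuristic_val, heuristic_x = best(predicted_fixing)

# True optimum over all variables, used here to measure the optimality gap.
opt_val, opt_x = best({})
print(f"heuristic assignment {heuristic_x} -> {heuristic_val}")
print(f"optimal assignment   {opt_x} -> {opt_val}, gap = {opt_val - heuristic_val}")
```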
The paper shared the most on social media this week is by a team at Sorbonne University: "Training data-efficient image transformers & distillation through attention" by Hugo Touvron et al (Dec 2020) with 247 shares. The authors produce a competitive convolution-free transformer by training on ImageNet only. @omarsar0 (elvis) tweeted "DeiT - Transformer-based image classification model built for high performance and requiring less compute & data. Uses distillation through attention and achieves 84.2 top-1 accuracy on the ImageNet benchmark trained on a single 8-GPU server over 3 days".
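As background, the distillation objective described for DeiT can be sketched in a few lines. The snippet below is a minimal sketch of hard-label distillation with equal weighting between the ground-truth loss and the teacher loss; the random tensors stand in for a real student (whose class token and distillation token would each produce a logit head) and for a convnet teacher, so nothing here reproduces the actual models or training setup.

```python
# Minimal sketch of hard-label distillation with two student logit heads.
import torch
import torch.nn.functional as F

def hard_distillation_loss(class_logits, dist_logits, teacher_logits, labels):
    """Average cross-entropy on ground-truth labels with cross-entropy on the
    teacher's hard (argmax) predictions."""
    teacher_hard = teacher_logits.argmax(dim=1)
    ce_true = F.cross_entropy(class_logits, labels)
    ce_teacher = F.cross_entropy(dist_logits, teacher_hard)
    return 0.5 * ce_true + 0.5 * ce_teacher

batch, num_classes = 8, 1000
class_logits = torch.randn(batch, num_classes, requires_grad=True)  # from the class token
dist_logits = torch.randn(batch, num_classes, requires_grad=True)   # from the distillation token
teacher_logits = torch.randn(batch, num_classes)                    # from a convnet teacher
labels = torch.randint(0, num_classes, (batch,))

loss = hard_distillation_loss(class_logits, dist_logits, teacher_logits, labels)
loss.backward()
print(float(loss))
```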
This week was very active for "Computer Science - Artificial Intelligence", with 147 new papers.
The paper discussed most in the news over the past week was by a team at DeepMind: "DeepMind Lab2D" by Charles Beattie et al (Nov 2020), which was referenced 4 times, including in the article Best of arXiv.org for AI, Machine Learning, and Deep Learning – November 2020 in InsideBIGDATA. The paper got social media traction with 153 shares. On Twitter, @DynamicWebPaige posted "👾 "We present Lab2D, a scalable environment simulator for multi-agent artificial intelligence research that facilitates researcher-led experimentation with environment design." This is giving me flashbacks to Number Munchers. 😄 👇 Learn more in the links below".
Leading researcher Oriol Vinyals (DeepMind) came out with "Solving Mixed Integer Programs Using Neural Networks", which had 24 shares over the past 2 days. The investigators apply learning to the two key sub-tasks of a MIP solver: generating a high-quality joint variable assignment, and bounding the gap in objective value between that assignment and an optimal one. This paper was also shared the most on social media with 222 tweets. @popular_ML (Popular ML resources) tweeted "The most popular ArXiv tweet in the last 24h".
This week was active for "Computer Science - Computer Vision and Pattern Recognition", with 229 new papers.
The paper discussed most in the news over the past week was by a team at DeepMind: "Object-based attention for spatio-temporal reasoning: Outperforming neuro-symbolic models with flexible distributed architectures" by David Ding et al (Dec 2020), which was referenced 3 times, including in the article DeepMind researchers claim neural networks can outperform neurosymbolic models in VentureBeat. The paper also got the most social media traction with 286 shares. A Twitter user, @DeepMind, posted "Can neural networks learn to perform explanatory & counterfactual reasoning? Researchers find that an object-centric transformer substantially outperforms leading neuro-symbolic models on two reasoning tasks thought to be challenging for deep neural nets".
Leading researcher Devi Parikh (Georgia Institute of Technology) published "KRISP: Integrating Implicit and Symbolic Knowledge for Open-Domain Knowledge-Based VQA". The investigators study open-domain knowledge, the setting where the knowledge required to answer a question is not given or annotated at either training or test time.
The paper shared the most on social media this week is by a team at Sorbonne University: "Training data-efficient image transformers & distillation through attention" by Hugo Touvron et al (Dec 2020).
Over the past week, 18 new papers were published in "Computer Science - Computers and Society".
The paper discussed most in the news over the past week was by a team at Indiana University: "The COVID-19 Infodemic: Twitter versus Facebook" by Kai-Cheng Yang et al (Dec 2020), which was referenced 2 times, including in the article Studies show bots might not be the dominant driver of COVID-19 misinformation on social media in VentureBeat. The paper got social media traction with 9 shares. The authors analyze the prevalence and diffusion of links to low-credibility content about the pandemic across two major social media platforms, Twitter and Facebook. A Twitter user, @MagusNet, posted "The COVID-19 Infodemic: #Twitter versus #Facebook Comparing the two platforms, we find popular low-credibility sources and suspicious videos tend to be verified by the platforms".
Over the past week, 22 new papers were published in "Computer Science - Human-Computer Interaction".
This week was very active for "Computer Science - Learning", with 288 new papers.
The paper discussed most in the news over the past week was by a team at University of Cambridge: "Hey Alexa what did I just type? Decoding smartphone sounds with a voice assistant" by Almos Zarandy et al (Dec 2020), which was referenced 7 times, including in the article Eavesdropping on Phone Taps from Voice Assistants in Schneier on Security. The paper got social media traction with 27 shares. The researchers show that privacy threats go beyond spoken conversations and include sensitive data typed on nearby smartphones. A Twitter user, @TwitchiH, commented ""...can extract PIN codes and text messages from recordings collected by a voice assistant located up to half a meter away. This shows that remote keyboard-inference attacks are not limited to physical keyboards but extend to virtual keyboards too."".
Leading researcher Oriol Vinyals (DeepMind) came out with "Solving Mixed Integer Programs Using Neural Networks", which had 24 shares over the past 2 days. The investigators apply learning to the two key sub-tasks of a MIP solver: generating a high-quality joint variable assignment, and bounding the gap in objective value between that assignment and an optimal one.
Over the past week, 13 new papers were published in "Computer Science - Multiagent Systems".
The paper discussed most in the news over the past week was by a team at DeepMind: "Imitating Interactive Intelligence" by Josh Abramson et al (Dec 2020), which was referenced 1 time, including in the article Data Science Weekly Newsletter - Issue 401 (Dec 17, 2020) in Data Science Weekly. The paper also got the most social media traction with 236 shares. The researchers study how to design artificial agents that can interact naturally with humans using the simplification of a virtual environment. A Twitter user, @ArthurBrussee, said "My* first paper on arXiv! It describes the 'playroom' environment, collection of 2 years(!) of 'language games' data, improvements for BC, and human agent interaction. *Work mostly done before I joined, by a large incredible team that were kind enough to credit me anyway :)".
Over the past week, 14 new papers were published in "Computer Science - Neural and Evolutionary Computing".
Leading researcher Oriol Vinyals (DeepMind) published "Solving Mixed Integer Programs Using Neural Networks", which had 24 shares over the past 2 days. The investigators apply learning to the two key sub-tasks of a MIP solver: generating a high-quality joint variable assignment, and bounding the gap in objective value between that assignment and an optimal one. This paper was also shared the most on social media with 222 tweets. @popular_ML (Popular ML resources) tweeted "The most popular ArXiv tweet in the last 24h".
This week was active for "Computer Science - Robotics", with 56 new papers.
The paper discussed most in the news over the past week was by a team at Zhejiang University: "EGO-Swarm: A Fully Autonomous and Decentralized Quadrotor Swarm System in Cluttered Environments" by Xin Zhou et al (Nov 2020), which was referenced 4 times, including in the article You can be my wingbot any time – US military successfully runs AI system on spy plane in The Register. The paper was shared 4 times on social media. The authors present a decentralized and asynchronous systematic solution for multi-robot autonomous navigation in unknown, obstacle-rich scenes using only onboard resources.
The paper shared the most on social media this week is by a team at University of California, Davis: "YolactEdge: Real-time Instance Segmentation on the Edge (Jetson AGX Xavier: 30 FPS, RTX 2080 Ti: 170 FPS)" by Haotian Liu et al (Dec 2020) with 87 shares.