Week Ending 04.28.19
RESEARCH WATCH: 04.28.19
Over the past week, 191 new papers were published in "Computer Science".
The paper discussed most in the news over the past week was "Flash Boys 2.0: Frontrunning, Transaction Reordering, and Consensus Instability in Decentralized Exchanges"by Philip Daian et al (Apr 2019), which was referenced 26 times, including in the article The Ledger: Binance’s Secret Sauce, Fake Satoshi, Waiting on Bakkt in NewsPuddle.com. The paper author, Ariel Juels, was quoted saying "This should incentivize the community to consider new exchange designs". The paper got social media traction with 386 shares. A user, @Miles_Kellerman, tweeted "Interesting paper by et al. on front-running in dig. currencies: Can confirm from my field research that dig. curr. exchs VERY slow to set up surveillance. also find only 50% of top 10 have surveillance in place - shockingly low".
Leading researcher Ruslan Salakhutdinov (Carnegie Mellon University) came out with "On Exact Computation with an Infinitely Wide Neural Net".
The paper shared the most on social media this week is by a team at Amazon: "Language Models with Transformers" by Chenguang Wang et al (Apr 2019) with 144 shares. @hardmaru (hardmaru) tweeted "“Experimental results on the PTB, WikiText-2, and WikiText-103 show that our method achieves perplexities between 20 and 34 on all problems, i.e. on average an improvement of 12 perplexity units compared to state-of-the-art LSTMs.” 🔥".
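For readers unfamiliar with the metric in the quoted tweet: perplexity is simply the exponential of a language model's average per-token negative log-likelihood, so lower is better. A minimal sketch with made-up numbers (not from the paper):

```python
import math

def perplexity(neg_log_likelihoods):
    """Perplexity = exp of the mean per-token negative log-likelihood (natural log)."""
    return math.exp(sum(neg_log_likelihoods) / len(neg_log_likelihoods))

# Toy example: per-token NLLs a language model might assign to a held-out sentence.
print(perplexity([3.2, 2.9, 3.5, 3.1]))  # ~23.9, i.e. e^3.175
```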
Over the past week, 76 new papers were published in "Computer Science - Artificial Intelligence".
The paper discussed most in the news over the past week was "Integrating Social Media into a Pan-European Flood Awareness System: A Multilingual Approach" by V. Lorini(European Commission, Joint Research Centre) et al (Apr 2019), which was referenced 8 times, including in the article AI uses tweets to help researchers analyze floods in Venturebeat. The paper got social media traction with 59 shares. The authors describe a prototype system that integrates social media analysis into the European Flood Awareness System (EFAS). A Twitter user, @valeriolorini, posted "Social Media for Flood Risk to #EFAS #GloFAS. In we describe why IR from Social Media is crucial for Natural Hazard DRR. SMFR is multilingual , done with #Convnet paper accepted as CoRe #floodrisk #CopernicusEMS".
Leading researcher Ruslan Salakhutdinov (Carnegie Mellon University) published "The MineRL Competition on Sample Efficient Reinforcement Learning using Human Priors"@gastronomy tweeted "> Though deep reinforcement learning has led to breakthroughs in many difficult domains, these successes have required an ever-i".
The paper shared the most on social media this week in this category is again "Language Models with Transformers" by Chenguang Wang et al (Apr 2019), from the Amazon team, with 144 shares (see above).
This week was active for "Computer Science - Computer Vision and Pattern Recognition", with 203 new papers.
The paper discussed most in the news over the past week was "Fooling automated surveillance cameras: adversarial patches to attack person detection" by Simen Thys et al (Apr 2019), which was referenced 34 times, including in the article Trump backs Harley Davidson on EU trade tariffs in BBC. The paper author, Wiebe Van Ranst, was quoted saying "The idea behind this work is to be able to circumvent security systems that use a person detector to generate an alarm when a person enters the view of a camera". The paper also got the most social media traction with 13746 shares. On Twitter, @pwang posted "This is going to be a major Tshirt trend over the next couple of years: Innocent-looking shirts with various patterns that are specifically designed to trick neural networks. Adversarial hats will also be a thing, for face detectors".
Leading researcher Ruslan Salakhutdinov (Carnegie Mellon University) published "On Exact Computation with an Infinitely Wide Neural Net".
The paper shared the most on social media this week is by a team at Google: "Attention Augmented Convolutional Networks" by Irwan Bello et al (Apr 2019) with 188 shares. The authors consider the use of self-attention for discriminative visual tasks as an alternative to convolutions. @ronnieclark__ (Ronnie Clark) tweeted "Concatenating convolutional channels with other feature maps seems to be quite popular these days - coordinate maps in Coord-Conv, camera intrinsic maps in Cam-Conv and self attention maps in this work by et al".
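As the summary above notes, the core move is to compute ordinary convolutional features and self-attention features over the same input and concatenate them along the channel dimension. A minimal single-head sketch (the paper's relative position embeddings and multi-head attention are omitted, and all dimensions are illustrative):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AugmentedConv(nn.Module):
    def __init__(self, c_in, c_conv, c_attn):
        super().__init__()
        self.conv = nn.Conv2d(c_in, c_conv, 3, padding=1)
        self.qkv = nn.Conv2d(c_in, 3 * c_attn, 1)   # 1x1 conv producing queries, keys, values
        self.c_attn = c_attn

    def forward(self, x):
        b, _, h, w = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=1)                  # each: (b, c_attn, h, w)
        q = q.flatten(2).transpose(1, 2)                       # (b, hw, c_attn)
        k = k.flatten(2)                                       # (b, c_attn, hw)
        v = v.flatten(2).transpose(1, 2)                       # (b, hw, c_attn)
        attn = F.softmax(q @ k / self.c_attn ** 0.5, dim=-1)   # (b, hw, hw) attention weights
        attn_out = (attn @ v).transpose(1, 2).reshape(b, self.c_attn, h, w)
        return torch.cat([self.conv(x), attn_out], dim=1)      # concatenate along channels

x = torch.rand(2, 16, 8, 8)
print(AugmentedConv(16, 24, 8)(x).shape)   # torch.Size([2, 32, 8, 8])
```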
Over the past week, 17 new papers were published in "Computer Science - Computers and Society".
The paper discussed most in the news over the past week was "#Cyberbullying in the digital age: Peoples perspective and information sharing behavior on Twitter" by Iman Tahamtan et al (Apr 2019), which was referenced 1 time, including in the article Research Paper: “#Cyberbullying in the Digital Age: People’s Perspective and Information Sharing Behavior on Twitter” (Preprint) in Library Journal. The paper was shared 3 times in social media.
Over the past week, 17 new papers were published in "Computer Science - Human-Computer Interaction".
Over the past week, 162 new papers were published in "Computer Science - Learning".
The paper discussed most in the news over the past week was "Generating Long Sequences with Sparse Transformers" by Rewon Child et al (Apr 2019), which was referenced 3 times, including in the article OpenAI researchers have developed Sparse Transformers, a neural network which can predict what comes next in a sequence in Packt. The paper got social media traction with 10 shares. A Twitter user, @yapp1e, said "Generating Long Sequences with Sparse Transformers. Transformers are powerful sequence models, but require time and memory that grows quadratically with the sequence length. In this paper we introduce sparse factorizations o".
Leading researcher Yoshua Bengio (Université de Montréal) published "Compositional generalization in a deep seq2seq model by separating syntax and semantics". @jaaanaru tweeted "Generalization by keeping syntactic & semantic information in separate streams. This simple trick from neuroscience boosts the compositional generalization score in SCAN dataset from 12.5% to 91% 🤯. Deep learning people, look into the brain for inspiration!".
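The tweet's "separate streams" refers to routing alignment and content through different representations: one embedding of the input decides where attention goes, while a second, independent embedding supplies what is copied into the output. A toy, hedged rendering of that separation (not the paper's actual architecture):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TwoStreamAttention(nn.Module):
    def __init__(self, vocab, dim):
        super().__init__()
        self.syntax = nn.Embedding(vocab, dim)     # used only for alignment scores
        self.semantics = nn.Embedding(vocab, dim)  # used only for the content passed on

    def forward(self, tokens, query):
        syn = self.syntax(tokens)                   # (seq, dim)
        sem = self.semantics(tokens)                # (seq, dim)
        weights = F.softmax(syn @ query, dim=0)     # alignment from the syntactic stream
        return weights @ sem                        # content from the semantic stream

layer = TwoStreamAttention(vocab=50, dim=16)
print(layer(torch.tensor([3, 7, 9]), torch.rand(16)).shape)  # torch.Size([16])
```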
The paper shared the most on social media this week is by a team at Microsoft: "Local Relation Networks for Image Recognition" by Han Hu et al (Apr 2019) with 95 shares. @hardmaru (hardmaru) tweeted "Local Relation Networks for Image Recognition They show local relation layers can adaptively infer meaningful compositional structure among visual elements in a local area of an image, and try to completely replace convolution layers with LR in a ResNet".
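The idea summarized in the tweet is that the aggregation weights over each local window are predicted from the pixels' own features rather than learned as a fixed convolution kernel. A toy single-head sketch of such a local relation layer (the paper's geometric prior and channel-sharing scheme are omitted):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LocalRelation(nn.Module):
    def __init__(self, channels, k=3):
        super().__init__()
        self.k = k
        self.query = nn.Conv2d(channels, channels, 1)
        self.key = nn.Conv2d(channels, channels, 1)

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.query(x)                                             # (b, c, h, w)
        k_feat = F.unfold(self.key(x), self.k, padding=self.k // 2)   # (b, c*k*k, h*w)
        v = F.unfold(x, self.k, padding=self.k // 2)                  # (b, c*k*k, h*w)
        k_feat = k_feat.view(b, c, self.k * self.k, h * w)
        v = v.view(b, c, self.k * self.k, h * w)
        # similarity between each position's query and its k*k neighbours' keys
        sim = (q.view(b, c, 1, h * w) * k_feat).sum(1)                # (b, k*k, h*w)
        weights = F.softmax(sim, dim=1)                               # adaptive "kernel" per position
        out = (v * weights.unsqueeze(1)).sum(2)                       # (b, c, h*w)
        return out.view(b, c, h, w)

print(LocalRelation(8)(torch.rand(2, 8, 16, 16)).shape)  # torch.Size([2, 8, 16, 16])
```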
Over the past week, six new papers were published in "Computer Science - Multiagent Systems".
Over the past week, 16 new papers were published in "Computer Science - Neural and Evolutionary Computing".
Leading researcher Ruslan Salakhutdinov (Carnegie Mellon University) published "On Exact Computation with an Infinitely Wide Neural Net".
Over the past week, 30 new papers were published in "Computer Science - Robotics".
The paper discussed most in the news over the past week was "Soft-bubble: A highly compliant dense geometry tactile sensor for robot manipulation" by Alex Alspach et al (Apr 2019), which was referenced 1 time, including in the article Video Friday: Massive Solar-Powered Drone, and More in Spectrum Online. The paper was shared 3 times in social media. On Twitter, @patmeansnoble said "Check out this cool paper that describes a hybrid tactile + depth image sensor (from Tedrake’s group): There is a link to the video as well in the paper. #SoftHaptics #ICP #NeuralNetworks #TactileSensing".