Eye On AI

Week Ending 08.11.19

RESEARCH WATCH: 08.11.19

Over the past week, 600 new papers were published in "Computer Science".

  • The paper discussed most in the news over the past week was "Tracking sex: The implications of widespread sexual data leakage and tracking on porn websites" by Elena Maris et al (Jul 2019), which was referenced 198 times, including in the article If You Watch Porn, Facebook, Google, Oracle Are Tracking That in Moguldom. The paper author, Elena Maris (Postdoctoral researcher at Microsoft), was quoted saying "The fact that the mechanism for adult site tracking is so similar to, say, online retail should be a huge red flag. This isn’t picking out a sweater and seeing it follow you across the web. This is so much more specific and deeply personal". The paper got social media traction with 173 shares. The investigators explore tracking and privacy risks on pornography websites. On Twitter, @citadelo observed "This research focuses on porn-sites users' tracking: "analysis of 22,484 pornography websites indicated that 93% leak user data to a third party." Unfortunately even incognito window does not solve the entire problem. Another reason to use".

  • Leading researcher Pieter Abbeel (University of California, Berkeley) published "DoorGym: A Scalable Door Opening Environment And Baseline Agent". @hardmaru tweeted "DoorGym: A Scalable Door Opening Environment And Baseline Agent. They introduce a highly configurable door simulation environment, as a step to move RL from toy environments towards atomic skills that can be composed and extended towards a broader goal".

  • The paper shared the most on social media this week is by a team at Georgia Institute of Technology: "ViLBERT: Pretraining Task-Agnostic Visiolinguistic Representations for Vision-and-Language Tasks" by Jiasen Lu et al (Aug 2019) with 168 shares. @vykthur (Victor Dibia) tweeted "They show that learning joint representations of image content and natural language using a BERT based architecture (multi-modal two-stream model, co-attentional transformer layers) improves performance for image+language tasks such as visual question answering, etc".

Over the past week, 36 new papers were published in "Computer Science - Artificial Intelligence".

Over the past week, 158 new papers were published in "Computer Science - Computer Vision and Pattern Recognition".

  • The paper discussed most in the news over the past week was by a team at UC Berkeley: "Natural Adversarial Examples" by Dan Hendrycks et al (Jul 2019), which was referenced 19 times, including in the article Semantic Based Adversarial Examples Fool Face Recognition in SyncedReview.com. The paper author, Steven Basart, was quoted saying "Anyone willing to test their models against our data set is free to do so". The paper got social media traction with 538 shares. A user, @DanHendrycks, tweeted "Natural Adversarial Examples are real-world and unmodified examples which cause classifiers to be consistently confused. The new dataset has 7,500 images, which we personally labeled over several months. Paper: Dataset and code".

  • Leading researcher Dhruv Batra (Georgia Institute of Technology) came out with "ViLBERT: Pretraining Task-Agnostic Visiolinguistic Representations for Vision-and-Language Tasks", which had 22 shares over the past 3 days and was also the most shared paper on social media this week, with 168 tweets. @vykthur (Victor Dibia) tweeted "They show that learning joint representations of image content and natural language using a BERT based architecture (multi-modal two-stream model, co-attentional transformer layers) improves performance for image+language tasks such as visual question answering, etc".

Over the past week, 20 new papers were published in "Computer Science - Computers and Society".

This week was active for "Computer Science - Human-Computer Interaction", with 26 new papers.

This week was active for "Computer Science - Learning", with 243 new papers.

  • The paper discussed most in the news over the past week was by a team at UC Berkeley: "Natural Adversarial Examples" by Dan Hendrycks et al (Jul 2019), covered above under Computer Vision and Pattern Recognition.

  • Leading researcher Pieter Abbeel (University of California, Berkeley) came out with "Dimensionality Reduction Flows". The authors propose methods to reduce the latent space dimension of flow models. @serrjoa tweeted "I've been thinking on how to reduce dimensionality in normalizing flows for a while now without success (involving matrix pseudo inverses and the like). Now I have something new to think on".

  • The paper shared the most on social media this week is by a team at DeepMind: "Behaviour Suite for Reinforcement Learning" by Ian Osband et al (Aug 2019) with 484 shares. The investigators introduce the Behaviour Suite for Reinforcement Learning, or bsuite for short. @koraykv (koray kavukcuoglu) tweeted "Open sourcing bsuite. Automated evaluation and analysis of agents on RL benchmarks. We hope this will help reproducible and accessible research on core problems in RL. Looking forward to seeing results from the community!".

Over the past week, 11 new papers were published in "Computer Science - Multiagent Systems".

Over the past week, 16 new papers were published in "Computer Science - Neural and Evolutionary Computing".

This week was active for "Computer Science - Robotics", with 48 new papers.
EYE ON A.I. GETS READERS UP TO DATE ON THE LATEST FUNDING NEWS AND RELATED ISSUES. SUBSCRIBE FOR THE WEEKLY NEWSLETTER.