Week Ending 10.3.2021
RESEARCH WATCH: 10.3.2021
This week was active for "Computer Science", with 1,284 new papers.
The paper discussed most in the news over the past week was by a team at Stanford University: "On the Opportunities and Risks of Foundation Models" by Rishi Bommasani et al (Aug 2021), which was referenced 24 times, including in the article Best of arXiv—Readings for October 2021 in Towards Data Science. The paper author, Rishi Bommasani, was quoted saying "The commercial incentive can lead companies to ignore social externalities such as the technological displacement of labor, the health of an informational ecosystem required for democracy, the environmental cost of computing resources, and the profit-driven sale of technologies to non-democratic regimes".
Leading researcher Ruslan Salakhutdinov (Carnegie Mellon University) came out with "FewNLU: Benchmarking State-of-the-Art Methods for Few-Shot Natural Language Understanding".
This week was very active for "Computer Science - Artificial Intelligence", with 202 new papers.
The paper discussed most in the news over the past week was by a team at Stanford University: "On the Opportunities and Risks of Foundation Models" by Rishi Bommasani et al (Aug 2021).
Leading researcher Sergey Levine (University of California, Berkeley) published "Bridge Data: Boosting Generalization of Robotic Skills with Cross-Domain Datasets".
This week was active for "Computer Science - Computer Vision and Pattern Recognition", with 267 new papers.
The paper discussed most in the news over the past week was by a team at Ben-Gurion University of the Negev: "Dodging Attack Using Carefully Crafted Natural Makeup" by Nitzan Guetta et al (Sep 2021), which was referenced 15 times, including in the article Best of arXiv—Readings for October 2021 in Towards Data Science.
Leading researcher Luc Van Gool (Computer Vision Laboratory) came out with "PDC-Net+: Enhanced Probabilistic Dense Correspondence Network".
This week was active for "Computer Science - Computers and Society", with 36 new papers.
The paper discussed most in the news over the past week was by a team at Stanford University: "On the Opportunities and Risks of Foundation Models" by Rishi Bommasani et al (Aug 2021).
This week was very active for "Computer Science - Human-Computer Interaction", with 50 new papers.
The paper discussed most in the news over the past week was "Baby Robot: Improving the Motor Skills of Toddlers" by Eric Cañas et al (Sep 2021), which was referenced 3 times, including in the article Baby Robot: A system that helps toddlers practice their motor skills in Tech Xplore. The paper author, Alba M. G. García, was quoted saying "During our experiments, we observed that the engagement of the babies that were playing with the Baby Robot toy substitute significantly increased, leading them to be in movement 3.1 times longer and to travel 4.4 times more meters than those in the control condition".
This week was extremely active for "Computer Science - Learning", with 470 new papers.
The paper discussed most in the news over the past week was by a team at Stanford University: "On the Opportunities and Risks of Foundation Models" by Rishi Bommasani et al (Aug 2021).
Leading researcher Ruslan Salakhutdinov (Carnegie Mellon University) came out with "FewNLU: Benchmarking State-of-the-Art Methods for Few-Shot Natural Language Understanding".
This week was active for "Computer Science - Multiagent Systems", with 21 new papers.
Over the past week, 29 new papers were published in "Computer Science - Neural and Evolutionary Computing".
The paper discussed most in the news over the past week was by a team at Google: "Primer: Searching for Efficient Transformers for Language Modeling" by David R. So et al (Sep 2021), which was referenced 2 times, including in the article Best of arXiv—Readings for October 2021 in Towards Data Science.
This week was extremely active for "Computer Science - Robotics", with 135 new papers.
The paper discussed most in the news over the past week was by a team at University of Washington: "CLIPort: What and Where Pathways for Robotic Manipulation" by Mohit Shridhar et al (Sep 2021), which was referenced 7 times, including in the article Faster grasp by contrast? Neuron. OpenAI teaches robots to get a grip with CLIP in TheRegister.com. The paper author, Mohit Shridhar (University of Washington), was quoted saying "CLIPort's capabilities are only limited to the actions shown during training demonstrations. If it's trained to 'stack two blocks,' and you ask it 'make a tower of 5 blocks,' it won't know how to do so. All the verbs are also tightly linked to the training demonstrations, in the sense that they won't do anything beyond the action-skills learnt during training".
Leading researcher Sergey Levine (University of California, Berkeley) published "Bridge Data: Boosting Generalization of Robotic Skills with Cross-Domain Datasets".