Week Ending 6.14.2020

 

RESEARCH WATCH: 6.14.2020

 

This week was active for "Computer Science - Artificial Intelligence", with 123 new papers.

This week was active for "Computer Science - Computer Vision and Pattern Recognition", with 273 new papers.

  • The paper discussed most in the news over the past week was "YOLOv4: Optimal Speed and Accuracy of Object Detection" by Alexey Bochkovskiy et al (Apr 2020), which was referenced 9 times, including in the article One stop for object detectors in Medium.com. The paper got social media traction with 834 shares. A Twitter user, @__usamah___, posted "idk why but reading a YOLO paper without pjreddie (and his antics) made me feel sad".

  • Leading researcher Trevor Darrell (UC Berkeley) came out with "Quasi-Dense Instance Similarity Learning". The investigators present a simple yet effective quasi-dense matching method to learn instance similarity from hundreds of region proposals in a pair of images.
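
For readers who want the flavor of quasi-dense matching, here is a minimal sketch, assuming a generic contrastive setup: proposal embeddings from a pair of frames are compared densely, and embeddings of the same instance are pulled together. The function name, loss form, and all shapes are illustrative assumptions, not the paper's exact formulation.

    import torch
    import torch.nn.functional as F

    def quasi_dense_contrastive_loss(emb_a, emb_b, ids_a, ids_b, temperature=0.07):
        """Illustrative contrastive loss over region-proposal embeddings.

        emb_a: (Na, D) embeddings of proposals in frame A
        emb_b: (Nb, D) embeddings of proposals in frame B
        ids_a, ids_b: instance ids per proposal; equal ids mean same object
        """
        emb_a = F.normalize(emb_a, dim=1)
        emb_b = F.normalize(emb_b, dim=1)
        sim = emb_a @ emb_b.t() / temperature       # (Na, Nb) dense similarity
        pos = ids_a[:, None] == ids_b[None, :]      # positive-pair mask
        # Softmax over all candidates in frame B; maximize positives' probability.
        log_p = F.log_softmax(sim, dim=1)
        return -(log_p[pos]).mean()

    # Toy usage with random proposals and instance ids:
    na, nb, d = 8, 10, 32
    loss = quasi_dense_contrastive_loss(
        torch.randn(na, d), torch.randn(nb, d),
        torch.randint(0, 4, (na,)), torch.randint(0, 4, (nb,)))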

  • The paper shared the most on social media this week is by a team at University of Michigan: "VirTex: Learning Visual Representations from Textual Annotations" by Karan Desai et al (Jun 2020) with 405 shares. @NeurAutomata (NeurAutomata) tweeted "karpathy "RT jcjohnss: Our new paper (w/kdexd) argues that "language is all you need" for good visual features: we train CNN+Transformer *from scratch* on ~100k images+captions from COCO, transfer the CNN to 6 downstream vision tasks, and match/ex…".
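
The tweet summarizes the recipe well enough to sketch: a CNN encodes the image, a Transformer decoder is trained from scratch to predict the caption, and afterwards only the CNN is kept for downstream tasks. The toy backbone, layer sizes, and class names below are assumptions, not the authors' architecture.

    import torch
    import torch.nn as nn

    class CaptionPretrainer(nn.Module):
        """Illustrative VirTex-style model: CNN backbone + Transformer decoder."""
        def __init__(self, vocab_size=10000, d_model=512):
            super().__init__()
            # Toy CNN standing in for a ResNet backbone.
            self.backbone = nn.Sequential(
                nn.Conv2d(3, d_model, kernel_size=7, stride=4, padding=3),
                nn.ReLU(),
                nn.AdaptiveAvgPool2d((7, 7)))
            self.embed = nn.Embedding(vocab_size, d_model)
            layer = nn.TransformerDecoderLayer(d_model, nhead=8, batch_first=True)
            self.decoder = nn.TransformerDecoder(layer, num_layers=2)
            self.head = nn.Linear(d_model, vocab_size)

        def forward(self, images, tokens):
            feats = self.backbone(images)                  # (B, D, 7, 7)
            memory = feats.flatten(2).transpose(1, 2)      # (B, 49, D) visual memory
            tgt = self.embed(tokens)                       # (B, T, D)
            t = tokens.size(1)
            mask = torch.triu(torch.full((t, t), float('-inf')), diagonal=1)
            out = self.decoder(tgt, memory, tgt_mask=mask) # causal caption decoding
            return self.head(out)                          # next-token logits

    model = CaptionPretrainer()
    logits = model(torch.randn(2, 3, 224, 224), torch.randint(0, 10000, (2, 12)))
    # After pretraining, model.backbone would be transferred to downstream tasks.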

This week was very active for "Computer Science - Computers and Society", with 59 new papers.

This week was active for "Computer Science - Human-Computer Interaction", with 31 new papers.

This week was extremely active for "Computer Science - Learning", with 749 new papers.

  • The paper discussed most in the news over the past week was by a team at Rice University: "SmartExchange: Trading Higher-cost Memory Storage/Access for Lower-cost Computation" by Yang Zhao et al (May 2020), which was referenced 8 times, including in the article Rice engineers offer smart, timely ideas for AI bottlenecks in Rice University. The paper author, Yingyan Lin (Rice University), was quoted saying "It can cost about 200 times more energy to access the main memory — the DRAM — than to perform a computation, so the key idea for SmartExchange is enforcing structures within the algorithm that allow us to trade higher-cost memory for much-lower-cost computation". The paper got social media traction with 8 shares.
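
The quoted trade-off can be illustrated with a small, hedged sketch: rather than fetching a full weight matrix from DRAM, store a small basis plus compact coefficients and rebuild the weights on the fly. The rank-16 SVD below is a generic stand-in for SmartExchange's structured decomposition, used only to show the storage-versus-compute exchange.

    import numpy as np

    rng = np.random.default_rng(0)
    W = rng.standard_normal((256, 256))   # full weight matrix (what we avoid storing)

    # Store only a small basis B and coefficients C with W ~= C @ B.
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    r = 16
    C = U[:, :r] * s[:r]                  # (256, 16) coefficients
    B = Vt[:r, :]                         # (16, 256) shared basis

    stored_full = W.size
    stored_smart = C.size + B.size        # ~8x fewer values to fetch from DRAM
    W_rebuilt = C @ B                     # cheap on-chip recomputation
    err = np.linalg.norm(W - W_rebuilt) / np.linalg.norm(W)
    print(stored_full, stored_smart, round(err, 3))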

  • Leading researcher Oriol Vinyals (DeepMind) published "Pointer Graph Networks". The investigators introduce Pointer Graph Networks (PGNs), which augment sets or graphs with additional inferred edges for improved model expressivity.
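
As a loose illustration of inferred edges, the sketch below gives each node one extra "pointer" edge chosen by a similarity argmax before message passing. The scoring rule and all names are assumptions, not the PGN mechanism as specified in the paper.

    import torch

    def add_pointer_edges(node_feats, edge_index):
        """Illustrative: infer one extra pointer edge per node via similarity.

        node_feats: (N, D) node features; edge_index: (2, E) existing edges.
        Returns edge_index augmented with argmax-similarity pointer edges.
        """
        scores = node_feats @ node_feats.t()            # (N, N) pairwise scores
        scores.fill_diagonal_(float('-inf'))            # no self-pointers
        ptr = scores.argmax(dim=1)                      # each node points somewhere
        n = node_feats.size(0)
        new_edges = torch.stack([torch.arange(n), ptr]) # (2, N) inferred edges
        return torch.cat([edge_index, new_edges], dim=1)

    # Toy usage: 5 nodes, 2 existing edges.
    aug = add_pointer_edges(torch.randn(5, 16), torch.tensor([[0, 1], [1, 2]]))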

  • The paper shared the most on social media this week is by a team at Facebook AI: "Linformer: Self-Attention with Linear Complexity" by Sinong Wang et al (Jun 2020) with 320 shares. @veydpz_public (Seung-won Park) tweeted "Unbelievable. The key-query matching can be reduced from O(n^2) to O(n*k), where projected dimension k can be set constant regardless of sequence length n".
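
The tweet's complexity claim is concrete enough to sketch in a few lines: learned projections E and F compress the length-n key and value sequences down to a fixed k, so the attention map is n-by-k instead of n-by-n. This single-head numpy version is a minimal sketch, not the paper's full multi-head implementation.

    import numpy as np

    def softmax(x, axis=-1):
        x = x - x.max(axis=axis, keepdims=True)
        e = np.exp(x)
        return e / e.sum(axis=axis, keepdims=True)

    def linformer_attention(Q, K, V, E, F):
        """Single-head Linformer-style attention.

        Q, K, V: (n, d) queries/keys/values; E, F: (k, n) learned projections.
        The attention map costs O(n*k) instead of O(n^2).
        """
        K_proj = E @ K                                    # (k, d) compressed keys
        V_proj = F @ V                                    # (k, d) compressed values
        A = softmax(Q @ K_proj.T / np.sqrt(Q.shape[1]))   # (n, k) attention map
        return A @ V_proj                                 # (n, d) output

    n, d, k = 512, 64, 32
    rng = np.random.default_rng(1)
    out = linformer_attention(*(rng.standard_normal(s) for s in
                                [(n, d), (n, d), (n, d), (k, n), (k, n)]))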

This week was active for "Computer Science - Multiagent Systems", with 27 new papers.

This week was active for "Computer Science - Neural and Evolutionary Computing", with 49 new papers.

  • The paper discussed most in the news over the past week was by a team at Stanford University: "Machine Learning on Graphs: A Model and Comprehensive Taxonomy" by Ines Chami et al (May 2020), which was referenced 1 time, including in the article London Bike Ride Forecasting with Graph Convolutional Networks in Towards Data Science. The paper also got the most social media traction with 329 shares. On Twitter, @kerstingAIML posted "Nice overview & conceptualization of (differentiable) approaches to learning on graphs. It is really important to get overviews & unifying views. 🙏 Follow up could be on learning with graphs, showing also the strong connection to graph kernels (via WL & neural fingerprints etc.)".

  • The paper shared the most on social media this week is "A bio-inspired bistable recurrent cell allows for long-lasting memory" by Nicolas Vecoven et al (Jun 2020) with 124 shares. @glouppe (Gilles Louppe) tweeted "Who knew? Changing the reset gate in GRU so that the gate can take values in ]0,20,1[ makes the cell bistable, hence enabling long-lasting memories! Very nice work by my colleagues and Guillaume Drion!".

This week was very active for "Computer Science - Robotics", with 66 new papers.

  • The paper discussed most in the news over the past week was by a team at University of Zurich and ETH Zurich: "Deep Drone Acrobatics" by Elia Kaufmann et al (Jun 2020), which was referenced 3 times, including in the article Researchers train drones to perform flips, rolls, and loops with AI in Venturebeat. The paper was shared once on social media. The researchers propose to learn a sensorimotor policy that enables an autonomous quadrotor to fly extreme acrobatic maneuvers with only onboard sensing and computation.
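
To make "sensorimotor policy" concrete, here is a purely hypothetical sketch of the input-output contract such a controller might have: a window of onboard measurements plus a reference maneuver in, collective thrust and body rates out. The network, its sizes, and the signal layout are all assumptions, not the authors' design.

    import torch
    import torch.nn as nn

    class AcrobaticPolicy(nn.Module):
        """Hypothetical sensorimotor policy: onboard history -> low-level commands."""
        def __init__(self, imu_dim=6, horizon=8, ref_dim=10):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(imu_dim * horizon + ref_dim, 128), nn.ReLU(),
                nn.Linear(128, 128), nn.ReLU(),
                nn.Linear(128, 4))   # [collective thrust, body rates x/y/z]

        def forward(self, imu_history, reference):
            # Flatten the measurement window and condition on the reference maneuver.
            x = torch.cat([imu_history.flatten(1), reference], dim=1)
            return self.net(x)

    policy = AcrobaticPolicy()
    cmd = policy(torch.randn(1, 8, 6), torch.randn(1, 10))  # one control step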


EYE ON A.I. GETS READERS UP TO DATE ON THE LATEST FUNDING NEWS AND RELATED ISSUES. SUBSCRIBE FOR THE WEEKLY NEWSLETTER.