Learning to Segment Every Thing

dwSun 2018-01-15 20:39
Existing methods for object instance segmentation require all training instances to be labeled with segmentation masks. This requirement makes it expensive to annotate new categories and has restricted instance segmentation models to roughly 100 well-annotated classes. …

Going Deeper with Convolutions

辉仔 2018-01-10 11:12
We propose a deep convolutional neural network architecture codenamed "Inception", which was responsible for setting the new state of the art for classification and detection in the ImageNet Large-Scale Visual Recognition Challenge 2014 (ILSVRC14). …

Learning Transferable Architectures for Scalable Image Recognition

jixiaohui 2018-01-10 15:36
Developing neural network image classification models often requires significant architecture engineering. In this paper, we attempt to automate this engineering process by learning the model architectures directly on the dataset of interest. …

Rethinking the Inception Architecture for Computer Vision

辉仔 2018-01-10 11:10
Convolutional networks are at the core of most state-of-the-art computer vision solutions for a wide variety of tasks. Since 2014 very deep convolutional networks started to become mainstream, yielding substantial gains in various benchmarks. …

Without a Formal CS Background: How I Taught Myself Convolutional Neural Network Architectures

人工智能头条 2018-10-12 16:38
The author's situation mirrors that of many developers and students in China who are just beginning to learn AI, which makes this article highly relevant in practice.

Inverted Residuals and Linear Bottlenecks: Mobile Networks for Classification, Detection and Segmentation

dwSun 2018-02-06 17:25
In this paper we describe a new mobile architecture, MobileNetV2, that improves the state-of-the-art performance of mobile models on multiple tasks and benchmarks as well as across a spectrum of different model sizes. …

Deep Residual Learning for Image Recognition

jixiaohui 2018-01-10 11:15
Deeper neural networks are more difficult to train. We present a residual learning framework to ease the training of networks that are substantially deeper than those used previously. We explicitly reformulate the layers as learning residual functions with reference to the layer inputs, instead of learning unreferenced functions. …
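The core idea of residual learning can be illustrated with a minimal NumPy sketch. This is not the paper's implementation (which uses convolutional layers); `residual_block`, the fully connected weights `w1`/`w2`, and the toy shapes are all illustrative stand-ins for the structure y = F(x) + x:

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def residual_block(x, w1, w2):
    """Two-layer block computing relu(F(x) + x): the layers learn the
    residual F, so the identity mapping is recovered simply by driving
    the weights (and hence F) toward zero."""
    h = relu(x @ w1)        # first linear transform + nonlinearity
    f = h @ w2              # residual function F(x)
    return relu(f + x)      # identity shortcut added before the final activation

rng = np.random.default_rng(0)
x = rng.standard_normal((2, 8))
w1 = rng.standard_normal((8, 8)) * 0.1
w2 = rng.standard_normal((8, 8)) * 0.1
y = residual_block(x, w1, w2)
# With zero weights the block collapses to relu(x): extra depth comes cheap.
y_identity = residual_block(x, np.zeros((8, 8)), np.zeros((8, 8)))
```

The shortcut is why very deep stacks of such blocks remain trainable: a block that has nothing useful to add can fall back to the identity instead of having to learn it.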

Identity Mappings in Deep Residual Networks

jixiaohui 2018-01-10 12:10
Deep residual networks have emerged as a family of extremely deep architectures showing compelling accuracy and nice convergence behaviors. In this paper, we analyze the propagation formulations behind the residual building blocks, which suggest that the forward and backward signals can be directly propagated from one block to any other block when identity mappings are used as the skip connections and after-addition activation. …

Dynamic Routing Between Capsules

dwSun 2018-02-06 16:16
A capsule is a group of neurons whose activity vector represents the instantiation parameters of a specific type of entity such as an object or an object part. We use the length of the activity vector to represent the probability that the entity exists and its orientation to represent the instantiation parameters. …
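The "length encodes existence probability" idea rests on a squashing nonlinearity that maps any vector to one of length less than 1 while preserving its direction. A minimal NumPy sketch of that function (the routing algorithm and `eps` guard here are illustrative, not the paper's full implementation):

```python
import numpy as np

def squash(v, eps=1e-9):
    """Shrink each capsule's activity vector so its length lies in [0, 1),
    interpreted as the probability that the entity exists, while keeping
    its direction (the instantiation parameters) unchanged."""
    norm = np.linalg.norm(v, axis=-1, keepdims=True)
    scale = norm**2 / (1.0 + norm**2)   # long vectors -> length near 1
    return scale * v / (norm + eps)     # short vectors -> length near 0

caps = np.array([[0.1, 0.0, 0.0],      # weakly activated capsule
                 [9.0, 0.0, 0.0]])     # strongly activated capsule
out = squash(caps)
lengths = np.linalg.norm(out, axis=-1)
```

Dynamic routing then uses agreement between these squashed vectors to decide which higher-level capsule each lower-level capsule sends its output to.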

Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift

辉仔 2018-01-10 10:49
Training deep neural networks is complicated by the fact that the distribution of each layer's inputs changes during training, as the parameters of the previous layers change. This slows down the training by requiring lower learning rates and careful parameter initialization, and makes it notoriously hard to train models with saturating nonlinearities. …
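The normalization step itself is simple to sketch. This is a minimal NumPy illustration of the training-time transform only; the paper's method additionally keeps running statistics for inference, and `batch_norm`, `gamma`, and `beta` are illustrative names:

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Normalize each feature over the mini-batch, then apply a learned
    scale (gamma) and shift (beta) so the layer can still represent the
    identity transform if that is what training prefers."""
    mu = x.mean(axis=0)                    # per-feature batch mean
    var = x.var(axis=0)                    # per-feature batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)  # zero mean, unit variance
    return gamma * x_hat + beta

rng = np.random.default_rng(1)
x = rng.standard_normal((64, 4)) * 5.0 + 3.0   # badly shifted/scaled inputs
y = batch_norm(x, gamma=np.ones(4), beta=np.zeros(4))
```

Because every layer now sees inputs with a stable distribution regardless of how earlier layers drift, much higher learning rates become usable.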