
Recursive Recurrent Nets With Attention Modeling For OCR In The Wild

Chen-Yu Lee, Simon Osindero. 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR) – 458 citations


We present recursive recurrent neural networks with attention modeling (R²AM) for lexicon-free optical character recognition in natural scene images. The primary advantages of the proposed method are: (1) the use of recursive convolutional neural networks (CNNs), which allow for parametrically efficient and effective image feature extraction; (2) an implicitly learned character-level language model, embodied in a recurrent neural network, which avoids the need for N-grams; and (3) the use of a soft-attention mechanism, allowing the model to selectively exploit image features in a coordinated way, and allowing for end-to-end training within a standard backpropagation framework. We validate our method with state-of-the-art performance on challenging benchmark datasets: Street View Text, IIIT5k, ICDAR and Synth90k.
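As a rough illustration of the soft-attention step described in the abstract, the sketch below uses a simple dot-product scoring function in NumPy; the shapes, scoring function, and variable names here are assumptions for illustration, not the paper's exact formulation:

```python
import numpy as np

def soft_attention(features, hidden):
    """Weight per-position image features by relevance to the decoder state.

    features: (T, D) array of CNN features at T spatial positions
    hidden:   (D,) current RNN decoder hidden state
    returns:  (context, weights) where context is the (D,) weighted sum
    """
    scores = features @ hidden                       # (T,) alignment scores
    scores = scores - scores.max()                   # numerical stability
    weights = np.exp(scores) / np.exp(scores).sum()  # softmax over positions
    context = weights @ features                     # (D,) context vector
    return context, weights

# Toy example: 8 spatial positions, 16-dim features
rng = np.random.default_rng(0)
feats = rng.standard_normal((8, 16))
h = rng.standard_normal(16)
context, w = soft_attention(feats, h)
```

Because every operation here is differentiable, gradients flow from the decoder back through the attention weights into the CNN features, which is what enables the end-to-end backpropagation training the abstract mentions.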

Similar Work