News

A collection of original, innovative ideas and algorithms towards Advanced Literate Machinery. This project is maintained by the OCR Team in the Language Technology Lab, Tongyi Lab, Alibaba Group.