ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators

ELECTRA: a discriminative pre-trained language model.

ACNet: Strengthening the Kernel Skeletons for Powerful CNN via Asymmetric Convolution Blocks

ACNet: re-parameterization of deep networks.

ALBERT: A Lite BERT for Self-supervised Learning of Language Representations

ALBERT: a lightweight BERT model.

Keep it SMPL: Automatic Estimation of 3D Human Pose and Shape from a Single Image

Fitting a 3D SMPL model from a single image.

Convert OpenPose output JSON files from 25 joints to 18 joints using the json library.
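A minimal sketch of this conversion, assuming OpenPose BODY_25 output and the 18-joint COCO target ordering; the `convert_25_to_18` and `convert_file` helper names are illustrative, not part of the original repo:

```python
import json

# BODY_25 joint indices that map, in order, onto the 18 COCO joints
# (MidHip and the six foot keypoints are dropped). This is the commonly
# used BODY_25 -> COCO correspondence, stated here as an assumption
# about the target format.
BODY25_TO_COCO18 = [0, 1, 2, 3, 4, 5, 6, 7, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18]

def convert_25_to_18(keypoints_75):
    """Turn a flat [x, y, conf] * 25 list into a flat [x, y, conf] * 18 list."""
    out = []
    for j in BODY25_TO_COCO18:
        out.extend(keypoints_75[3 * j : 3 * j + 3])
    return out

def convert_file(src_path, dst_path):
    """Rewrite one OpenPose JSON output file with 18-joint keypoints."""
    with open(src_path) as f:
        data = json.load(f)
    for person in data.get("people", []):
        person["pose_keypoints_2d"] = convert_25_to_18(person["pose_keypoints_2d"])
    with open(dst_path, "w") as f:
        json.dump(data, f)
```

Each person in an OpenPose JSON file stores `pose_keypoints_2d` as a flat list of `(x, y, confidence)` triples, so the 25-joint format has 75 values and the converted 18-joint format has 54.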

Language Models are Unsupervised Multitask Learners

GPT2: language models are unsupervised multitask learners.