Entity Linking (EL) recognizes textual mentions of entities and maps them to the corresponding entities in a Knowledge Graph (KG). In this paper, we propose a novel method for EL on short text that builds entity representations based on their name labels, descriptions, and other related entities in the KG. We then leverage a pre-trained BERT model to compute the semantic similarity between an entity and the text. This method does not require a large volume of data to jointly train word and entity representations, and is easily portable to a new domain that has a KG. We demonstrate that our approach outperforms previous methods on a public benchmark dataset by a large margin.
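To make the scoring step concrete, the following is a minimal, hedged sketch of how an entity's KG fields can be assembled into text and compared with the mention context using a pre-trained BERT encoder and cosine similarity. The model name, mean pooling, bi-encoder setup, and the example entity entry are illustrative assumptions, not the paper's exact architecture or data.

# A minimal sketch (not the exact method of the paper): build a textual entity
# representation from KG fields and score it against the mention text with a
# pre-trained BERT model via cosine similarity. Model name, pooling strategy,
# and the example entity entry are assumptions for illustration.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def encode(text: str) -> torch.Tensor:
    """Return a mean-pooled BERT embedding for the input text."""
    inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=128)
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state   # (1, seq_len, 768)
    mask = inputs["attention_mask"].unsqueeze(-1)     # (1, seq_len, 1)
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)

# Entity representation assembled from its name label, description,
# and related entities in the KG (hypothetical example).
entity_text = "Paris. Capital and largest city of France. Related: France, Seine, Ile-de-France."
mention_text = "She moved to Paris to study art."

similarity = torch.cosine_similarity(encode(entity_text), encode(mention_text))
print(float(similarity))  # higher score = stronger candidate match

In practice the same scoring could also be done with a cross-encoder that feeds the entity text and mention text into BERT jointly; the sketch above only illustrates the general idea of measuring semantic similarity between a KG-derived entity representation and the short input text.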