Medicine instructions usually contain rich medical relations, and extracting them is helpful for many downstream tasks such as medicine knowledge graph construction and medicine side-effect prediction. Existing relation extraction (RE) methods usually predict relations between entities from their contexts and do not incorporate medical knowledge. However, understanding some medical relations may require expert knowledge in the medical field, making it challenging for existing methods to achieve satisfactory performance on medical RE. In this paper, we propose a knowledge-enhanced framework for medical RE, which exploits medical knowledge of medicines to better conduct medical RE on Chinese medicine instructions. We first propose a BERT-CNN-LSTM based framework for text modeling that learns representations of characters from their contexts. We then learn a representation of each entity by aggregating the representations of its characters. Besides, we propose a CNN-LSTM based framework for entity modeling that learns entity representations from the relatedness between entities. In addition, there are usually many different instructions for the same medicine, and these instructions typically share general knowledge about that medicine. Thus, to obtain medical knowledge of medicines, we annotate relations on a randomly sampled instruction of each medicine. We then build knowledge embeddings that represent potential relations between entities based on this medicine knowledge. Finally, we use an MLP network to predict relations between entities from their representations and knowledge embeddings. Extensive experiments on a real-world dataset show that our method significantly outperforms existing methods.
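To make the final prediction step concrete, the following is a minimal sketch of how an MLP could score relation types from two entity representations plus a knowledge embedding, as the abstract describes. All names, dimensions, and the one-hidden-layer architecture here are illustrative assumptions, not the paper's actual configuration; the weights below are random stand-ins for parameters that would be learned end to end.

```python
import numpy as np

rng = np.random.default_rng(0)

HIDDEN = 64      # assumed size of a learned entity representation
KNOW = 32        # assumed size of a knowledge embedding
N_RELATIONS = 5  # assumed number of relation types

# Hypothetical MLP weights; in the real model these are trained.
W1 = rng.normal(scale=0.1, size=(2 * HIDDEN + KNOW, 128))
b1 = np.zeros(128)
W2 = rng.normal(scale=0.1, size=(128, N_RELATIONS))
b2 = np.zeros(N_RELATIONS)

def predict_relation(head_repr, tail_repr, knowledge_emb):
    """Concatenate the two entity representations with the knowledge
    embedding and score each relation type with a one-hidden-layer MLP."""
    x = np.concatenate([head_repr, tail_repr, knowledge_emb])
    h = np.maximum(0.0, x @ W1 + b1)     # ReLU hidden layer
    logits = h @ W2 + b2
    exp = np.exp(logits - logits.max())  # numerically stable softmax
    return exp / exp.sum()

# Toy vectors standing in for learned entity and knowledge representations.
head = rng.normal(size=HIDDEN)
tail = rng.normal(size=HIDDEN)
know = rng.normal(size=KNOW)

probs = predict_relation(head, tail, know)
print(probs.shape)  # one probability per relation type
```

The key design point the abstract implies is that the knowledge embedding enters the classifier as an extra input alongside the context-derived entity representations, so knowledge can influence the relation decision without changing the text encoder.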