In recent years, end-to-end task-oriented dialogue systems have achieved remarkable breakthroughs. However, existing dialogue models tend to summarize the entire dialogue history equally as the context representation and apply memory networks to incorporate external knowledge. They neglect to highlight the latest user request, which causes the dialogue system to generate improper responses. In addition, the original memory networks interact between memories only at the hop level, which is insufficient and makes it difficult to extract more useful knowledge information. To address these issues, we propose a novel neural model that incorporates Dual-Aware with Hierarchical Interactive Memory Networks (DA-HIMN). The dual-aware component, consisting of a static request-aware module and a dynamic KB-aware module, is responsible for capturing the latest user request and collecting related knowledge information. Furthermore, we design a hierarchical interaction mechanism that augments the memory networks at the layer level to learn the knowledge representation more adequately. Experimental results demonstrate that our model outperforms baseline models on two task-oriented dialogue datasets across several evaluation metrics.