It is widely believed that AR glasses will be the next-generation personal computing platform. Lightweight, powerful AR glasses with all-day battery life have the potential to give users hands-free access to information anytime, anywhere, without the need for physical displays. An intelligent AR interface will need to handle challenges such as avoiding the occlusion of important real-world objects, using real-world surfaces when appropriate, and determining how and when content should move along with the user. However, because such interfaces barely exist today, there are still limited insights into what they would look like and how they would adapt to different contexts. In this demonstration, we showcase our envisioned future of context-aware AR interfaces. Users will be able to experience the typical life of a future all-day AR user and see how AR interfaces could adapt to different contexts. The interface we designed is able to 1) avoid occluding important real-world objects, 2) recognize user activities and change where content is fixed accordingly, and 3) be conversation-sensitive, providing the information users need. We hope that our showcase will offer a glimpse of the future and further inspire the design of adaptive interfaces for head-worn AR displays.