Automatic gesture segmentation is highly challenging due to its computational burden, unpredictable body motion, and ambiguous non-gesture hand motion. In this paper, a new approach based on a Hausdorff-distance model-tracking technique is developed for real-time human-computer interaction. The paper proposes a Three-Phase Model Tracking approach, which consists of two main stages. The first is motion history analysis, which classifies a dynamic gesture into preparation, nucleus, and retraction states based on temporal relationships. The second is model tracking, which tracks the signer model and the object model under different constraints depending on the classified state. Finally, the gesture model is extracted by matching the object model against the signer model, and the hand gesture region is segmented from the gesture model. Experiments are performed to test the robustness of the segmentation under varying hand scales and complex backgrounds. The segmentation error rate and computational complexity are also analyzed to demonstrate that the proposed Three-Phase Model Tracking approach is applicable to real-time human-computer interaction systems.
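To make the core matching operation concrete, the following is a minimal sketch of the symmetric Hausdorff distance between two point sets, as might be used to match an object model against a signer model; the point-set representation and function names are illustrative assumptions, not the paper's actual tracking procedure.

```python
import math

# Illustrative sketch: each model is assumed to be a set of 2-D edge
# points; the Hausdorff distance measures how far the two point sets
# are from each other in the worst case.

def directed_hausdorff(a, b):
    """Max over points in a of the distance to the nearest point in b."""
    return max(min(math.dist(p, q) for q in b) for p in a)

def hausdorff(a, b):
    """Symmetric Hausdorff distance between point sets a and b."""
    return max(directed_hausdorff(a, b), directed_hausdorff(b, a))

# Example: two small point sets differing by one displaced point.
d = hausdorff([(0, 0), (1, 0)], [(0, 0), (2, 0)])  # -> 1.0
```

In a tracking loop, a candidate model pose would typically be scored by this distance and the best-scoring pose retained; the low cost of the directed distance is one reason Hausdorff-based matching suits real-time use.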