The variety and richness of costumes is important to the brilliance and fluency of a performance. However, if performers must change costumes for every show, both changing clothes during the performance and transporting the costumes become problems. To address these problems and increase performance variability, this paper proposes a method that lets users project virtual clothing onto themselves using a computer and a projector. A Kinect captures the user's body and skeleton, together with the position and orientation of each joint. We implement a fast and efficient three-step coordinate transformation from camera coordinates to real-world three-dimensional coordinates in order to project and control the virtual clothing in real time. Users can choose among different costumes in our system, so anyone can easily wear virtual costumes during a show.
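As a rough illustration only (the paper's actual three-step transform and its calibration values are not given in this abstract), mapping a Kinect joint from camera coordinates into a real-world frame can be sketched as a rigid transform. The rotation matrix `R` and translation vector `t` below are hypothetical placeholders standing in for calibration results:

```python
import numpy as np

# Hypothetical calibration results (assumed for illustration, not from the paper):
# R rotates camera axes into the world/projector frame; t is the camera's
# position in that frame.
R = np.array([[1.0, 0.0,  0.0],
              [0.0, 0.0, -1.0],
              [0.0, 1.0,  0.0]])   # example rotation: swaps the Y and Z axes
t = np.array([0.5, 1.2, 0.0])      # example translation, in metres

def camera_to_world(p_cam):
    """Map a 3-D point from camera coordinates to world coordinates."""
    return R @ np.asarray(p_cam, dtype=float) + t

# A joint reported by the depth camera at (0.1, 0.2, 2.0) in camera space:
joint_world = camera_to_world([0.1, 0.2, 2.0])
```

In practice the rotation and translation would come from a calibration step between the depth camera and the projector, and the transformed joint positions would drive the placement of the rendered costume.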