Transfer learning-based fine-tuning is a promising method for overcoming the challenges of on-device learning, including scarce training data, limited computational resources, and strict parameter-update rules. In brief, this method offers four advantages for on-device learning: (1) it reduces the computational overhead of network training; (2) it avoids the model overfitting caused by scarce training data; (3) it accelerates overall training; and (4) it yields highly personalized models for diverse applications. This chapter discusses modern techniques for optimizing neural network models, which provide opportunities to save computational and communication overhead. These techniques make machine learning affordable on tiny devices and bring AI applications into everyday life.
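As a minimal sketch of the idea, the following PyTorch snippet freezes a (here, randomly initialized stand-in for a pretrained) backbone and trains only a small classification head. The network shapes and hyperparameters are illustrative assumptions, not part of this chapter; the point is that the optimizer touches only the head's few parameters, which is what reduces training cost and overfitting risk on-device.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for a pretrained backbone (in practice, load
# pretrained weights); the head is the small task-specific classifier.
backbone = nn.Sequential(nn.Linear(16, 32), nn.ReLU())
head = nn.Linear(32, 4)
model = nn.Sequential(backbone, head)

# Freeze the backbone: its parameters receive no gradients and are
# never updated, so on-device training only touches the head.
for p in backbone.parameters():
    p.requires_grad = False

# The optimizer sees only the trainable (head) parameters.
opt = torch.optim.SGD(
    [p for p in model.parameters() if p.requires_grad], lr=0.1
)

# One training step on a tiny synthetic batch.
x = torch.randn(8, 16)
y = torch.randint(0, 4, (8,))
loss = nn.CrossEntropyLoss()(model(x), y)
loss.backward()
opt.step()

trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
total = sum(p.numel() for p in model.parameters())
print(trainable, total)  # head params vs. all params
```

Here only 132 of the 676 parameters (the head's weights and bias) are ever updated, so both the backward pass work and the risk of overfitting a small on-device dataset shrink accordingly.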