What is the warm start feature and how to use it?
Warm start initializes a new training job with weights from an already-trained model, allowing you to build upon previous work instead of starting from scratch. To use it, pass the ID of an existing training job to trainer.fit() as original_job_id. The new job inherits the original job's ModelArchitecture, ColumnProcessing, and NeighborSampling configurations, which ensures that incompatible changes to settings such as the model architecture cannot be made.
You can only customize the TrainingJobPlan and OptimizationPlan when warm starting.
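The mechanics above can be sketched as follows. This is a minimal, self-contained illustration, not the library's real implementation: the `Trainer` class, the `COMPLETED_JOBS` registry, and all configuration values are hypothetical stand-ins; only the `fit()` / `original_job_id` names and the inherited-configuration behavior come from the description above.

```python
# Illustrative sketch of warm start. The Trainer class and COMPLETED_JOBS
# registry are hypothetical stand-ins for the real library.
from dataclasses import dataclass, field

# Hypothetical registry of completed training jobs, keyed by job ID.
COMPLETED_JOBS = {
    "job-123": {
        "model_architecture": {"num_layers": 2, "hidden_dim": 128},
        "column_processing": {"normalize": True},
        "neighbor_sampling": {"fanout": [10, 10]},
    }
}

@dataclass
class Trainer:
    # Only the optimization settings may be customized on a warm start.
    optimization_plan: dict = field(
        default_factory=lambda: {"lr": 1e-3, "max_num_epochs": 10}
    )

    def fit(self, original_job_id=None):
        if original_job_id is not None:
            # Warm start: inherit architecture, column processing, and neighbor
            # sampling from the original job, so incompatible changes to those
            # settings cannot be introduced.
            inherited = COMPLETED_JOBS[original_job_id]
            self.model_architecture = inherited["model_architecture"]
            self.column_processing = inherited["column_processing"]
            self.neighbor_sampling = inherited["neighbor_sampling"]
        # ... training would proceed here, initialized from the original
        # job's weights ...
        return self

# Warm-start a new job from "job-123", customizing only the optimization plan
# (here lowering the learning rate, which can help keep embeddings stable).
trainer = Trainer(optimization_plan={"lr": 1e-4, "max_num_epochs": 3})
trainer.fit(original_job_id="job-123")
```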
Benefits of warm start
- Stable embeddings: Update embeddings over time without causing a drastic shift in the embedding space. This helps maintain consistency for downstream tasks such as retrieval or ranking. Achieving stable embeddings may require tuning optimization parameters such as the learning rate (lr) and the number of epochs (max_num_epochs).
- Faster retraining: Warm start is especially useful when the data distribution (such as user behavior) changes over time and frequent retraining is needed to keep the model up to date.
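Embedding stability can be checked directly by comparing vectors before and after a warm-started retrain. The sketch below uses cosine similarity; the embedding values are purely illustrative, and the helper function is not part of the library.

```python
# Sketch: measuring embedding drift after a warm-started retrain via cosine
# similarity. All vectors here are illustrative, not real model output.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

old_embedding = [0.1, 0.8, 0.3]
new_embedding = [0.12, 0.79, 0.31]  # warm start: small drift from the old vector

similarity = cosine_similarity(old_embedding, new_embedding)
# A similarity close to 1.0 suggests the embedding space stayed consistent for
# downstream retrieval or ranking; a cold-started model gives no such guarantee.
```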
Requirements and Limitations
This feature is newly introduced and is currently subject to the following requirements and limitations:
- The warm start job must have the same task type as the original job
- Forecasting: Not supported.
- Classification: Target classes must match the original job exactly.
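The class-matching requirement for classification can be expressed as a simple pre-flight check. The function name and error message below are hypothetical; only the rule itself (target classes must match the original job exactly) comes from the text above.

```python
# Sketch: validating the warm start requirement that a classification job's
# target classes match the original job's classes exactly. Illustrative only.
def validate_warm_start_classes(original_classes, new_classes):
    if set(original_classes) != set(new_classes):
        raise ValueError(
            "Warm start requires the target classes to match the original job exactly."
        )

# Same classes (order does not matter here): passes.
validate_warm_start_classes(["churn", "no_churn"], ["no_churn", "churn"])
```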