Hugging Face Accelerate is a library that enables the same PyTorch code to run across any distributed configuration, simplifying model training and inference at scale. Accelerate includes a W&B Tracker, which we show how to use below. You can also read more about Accelerate Trackers in the Hugging Face documentation.

Start logging with Accelerate

To get started with Accelerate and W&B you can follow the pseudocode below:
from accelerate import Accelerator

# Tell the Accelerator object to log with wandb
accelerator = Accelerator(log_with="wandb")

# Initialise your wandb run, passing wandb parameters and any config information
accelerator.init_trackers(
    project_name="my_project",
    config={"dropout": 0.1, "learning_rate": 1e-2},
    init_kwargs={"wandb": {"entity": "my-wandb-team"}},
)

...

# Log to wandb by calling `accelerator.log`, `step` is optional
accelerator.log({"train_loss": 1.12, "valid_loss": 0.8}, step=global_step)


# Make sure that the wandb tracker finishes correctly
accelerator.end_training()
In more detail, you need to:
  1. Pass log_with="wandb" when initialising the Accelerator class
  2. Call the init_trackers method and pass it:
  • a project name via project_name
  • any parameters you want to pass to wandb.init() via a nested dict in init_kwargs
  • any other experiment config information you want to log to your wandb run, via config
  3. Use the .log method to log to Weights & Biases; the step argument is optional
  4. Call .end_training when finished training

Access the W&B tracker

To access the W&B tracker, use the Accelerator.get_tracker() method. Pass in the string corresponding to a tracker’s .name attribute, which returns the tracker on the main process.
wandb_tracker = accelerator.get_tracker("wandb")

From there you can interact with wandb’s run object like normal:
wandb_tracker.log_artifact(some_artifact_to_log)
Trackers built into Accelerate automatically execute on the correct process, so if a tracker is only meant to run on the main process it will do so automatically. If you want to remove Accelerate’s wrapping entirely, you can achieve the same outcome with:
wandb_tracker = accelerator.get_tracker("wandb", unwrap=True)
if accelerator.is_main_process:
    wandb_tracker.log_artifact(some_artifact_to_log)
