# neural-forecast

Chirag Ahuja

02/16/2023, 10:04 AM
While trying TFT, I'm seeing only 20% GPU utilization. Any suggestions on how to maximize it?

Chris Gervais

02/17/2023, 9:22 PM
we’ve also seen this, it seems related to the num_workers attribute on the dataloaders. since the pl.Trainer class + data modules are abstracted away in the NeuralForecast core module, the only workaround we’ve found is to fit the model for a single epoch so that the trainer class + data modules initialize. then you can copy the datamodule and make any customized changes you want to try
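A schematic sketch of the copy-and-override pattern described above. This is not real NeuralForecast code — the library abstracts its `pl.Trainer` and datamodule away, so the `DataModule` class here is a hypothetical stand-in; the point is just that after a one-epoch fit you deep-copy the initialized datamodule and tweak `num_workers` on the copy:

```python
import copy

class DataModule:
    # hypothetical stand-in for the LightningDataModule that the
    # library builds internally during the first fit
    def __init__(self, num_workers=0):
        self.num_workers = num_workers

# after fitting for a single epoch, the trainer + datamodule exist;
# grab the datamodule, copy it, and experiment on the copy
original = DataModule(num_workers=0)   # what the library initialized
tuned = copy.deepcopy(original)        # copy, leaving the original intact
tuned.num_workers = 8                  # try more loader workers to feed the GPU
```

The deep copy matters: mutating the library's own datamodule in place can leave the fitted model in an inconsistent state, whereas a copy lets you compare settings side by side.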
that said, 1.4.0 just dropped a few days ago and might have some goodies to check out :)