Padertorch Course

Welcome to the Padertorch online course with a live instructor, using the interactive cloud desktop environment DaDesktop. Experience remote live training on an interactive remote desktop, led by a human being!

#deep-learning Training

7 hours


What is Padertorch?

Padertorch is designed to simplify the training of deep learning models written with PyTorch. While it focuses on speech and audio processing, it is not limited to these application areas. The repository is currently under construction, and the examples in contrib/examples only work in the Paderborn NT environment.



  • Logging:

    • The model's review method returns a dictionary that is logged and visualized via Tensorboard. The keys define the logging type (e.g. scalars).

    • As logging backend we use TensorboardX to generate a tfevents file that can be visualized with Tensorboard.
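To make the logging convention concrete, here is a minimal pure-Python sketch of the kind of dictionary a review method might return. The key names `losses` and `scalars` follow the convention described above; the function name and values are hypothetical, not Padertorch's actual API.

```python
def make_review(loss_value, snr_value):
    """Sketch of a review dictionary: the top-level keys select the
    Tensorboard logging type ('losses' contribute to the training
    loss, 'scalars' are logged as Tensorboard scalars)."""
    return {
        'losses': {'cross_entropy': loss_value},
        'scalars': {'snr': snr_value},
    }

review = make_review(0.7, 12.5)
```

In the real framework such a dictionary would be produced per training step and written to the tfevents file by the summary machinery.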

  • Dataset type:

    • lazy_dataset.Dataset and other iterables.

  • Validation:

    • The ValidationHook runs periodically and logs the validation results.

  • Learning rate decay with backoff:

    • The ValidationHook also has parameters to perform learning rate decay with backoff.
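The idea behind decay with backoff can be sketched in a few lines of plain Python: if the validation loss has not improved for a number of checks, the learning rate is reduced. The class name, parameters, and defaults below are hypothetical illustrations, not Padertorch's implementation (which additionally rolls training back to the best checkpoint).

```python
class LRBackoff:
    """Hypothetical sketch of learning rate decay with backoff:
    multiply the learning rate by `factor` after `patience`
    validation checks without improvement."""

    def __init__(self, lr=1e-3, factor=0.5, patience=2):
        self.lr = lr
        self.factor = factor
        self.patience = patience
        self.best = float('inf')
        self.bad_count = 0

    def step(self, val_loss):
        if val_loss < self.best:
            self.best = val_loss
            self.bad_count = 0
        else:
            self.bad_count += 1
            if self.bad_count >= self.patience:
                self.lr *= self.factor  # decay with backoff
                self.bad_count = 0
        return self.lr
```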

  • Test run:

    • The trainer has a test run function that trains the model for a few iterations and tests whether

      • the model is executable (burn test)

      • the validation is deterministic/reproducible

      • the model parameters change during training

  • Hooks:

    • The hooks are used to extend the basic features of the trainer. Usually the user does not need to care about the hooks: by default a SummaryHook, a CheckpointHook and a StopTrainingHook are registered, so the user only needs to register a ValidationHook.
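The hook pattern itself is simple: the trainer calls every registered hook at fixed points of the training loop, so features like summaries, checkpoints and early stopping can be added without changing the loop. A minimal sketch, with a hypothetical interface that only mirrors the idea (not Padertorch's actual hook signatures):

```python
class Hook:
    """Base class: the trainer calls these methods at fixed points."""

    def post_step(self, iteration):
        pass


class StopTrainingHook(Hook):
    """Sketch of a stop condition expressed as a hook."""

    def __init__(self, max_iterations):
        self.max_iterations = max_iterations
        self.stop = False

    def post_step(self, iteration):
        if iteration >= self.max_iterations:
            self.stop = True
```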

  • Checkpointing:

    • The parameters of the model and the state of the trainer are saved periodically. The interval can be specified with the checkpoint_trigger (the units are epoch and iteration).
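An (interval, unit) trigger can be sketched as a small callable that fires once per interval of the chosen unit. This is a hypothetical re-implementation for illustration, not Padertorch's trigger class:

```python
class IntervalTrigger:
    """Fires every `interval` units, where unit is 'epoch' or
    'iteration' (sketch of a checkpoint_trigger-style object)."""

    def __init__(self, interval, unit):
        assert unit in ('epoch', 'iteration')
        self.interval = interval
        self.unit = unit
        self.last = None

    def __call__(self, iteration, epoch):
        value = epoch if self.unit == 'epoch' else iteration
        index = value // self.interval
        if index != self.last:  # entered a new interval -> fire once
            self.last = index
            return True
        return False
```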

  • Virtual minibatch:

    • The trainer usually does not know whether the model is trained with a single example or with multiple examples (a minibatch), because the examples yielded by the dataset are forwarded directly to the model.

    • When the virtual_minibatch_size option is larger than one, the trainer calls the forward and backward steps virtual_minibatch_size times before applying the gradients. This increases the effective minibatch size while the memory consumption stays similar.
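Virtual minibatching is gradient accumulation: gradients from several forward/backward passes are accumulated and applied in one optimizer step. A minimal sketch in plain Python, with hypothetical callbacks standing in for the model and optimizer (here gradients are averaged; whether a framework averages or sums is an implementation detail):

```python
def train_with_virtual_minibatch(examples, grad_fn, apply_fn,
                                 virtual_minibatch_size):
    """Accumulate gradients over `virtual_minibatch_size` examples,
    then apply them once. Memory stays close to the single-example
    case because examples are processed one at a time.

    grad_fn(example)  -> gradient (one forward + backward pass)
    apply_fn(gradient) -> applies the gradient (optimizer step)
    """
    accumulated = 0.0
    for i, example in enumerate(examples, start=1):
        accumulated += grad_fn(example)
        if i % virtual_minibatch_size == 0:
            apply_fn(accumulated / virtual_minibatch_size)
            accumulated = 0.0
```

For example, with virtual_minibatch_size=2 and examples [1.0, 2.0, 3.0, 4.0], the optimizer step is called twice, once per pair of examples.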

Would you like to learn Padertorch?

Simply click the "Book" button for Padertorch and proceed to the payment method. Enter your desired training schedule. You will receive an email confirmation for Padertorch, and a representative / trainer will get in touch with you.

Course Category:

   Programming Training

Last Updated:


Course Schedules

Date Time
July 18, 2022 (Monday) 09:30 AM - 04:30 PM
August 1, 2022 (Monday) 09:30 AM - 04:30 PM
August 15, 2022 (Monday) 09:30 AM - 04:30 PM
August 29, 2022 (Monday) 09:30 AM - 04:30 PM
September 12, 2022 (Monday) 09:30 AM - 04:30 PM
September 26, 2022 (Monday) 09:30 AM - 04:30 PM

Padertorch consultancy is available.

Let us know how we can help you.