6db26eb1 · Add fp16 mixed precision training
Alexandru-Mihai GHERGHESCU authored
This should give training a theoretical 2x speedup in time (though in
practice the gain is usually smaller), with close to no loss in model
performance.

The interface lets the user choose between mixed precision training and
no mixed precision, the latter falling back to normal float32 precision.

CPU support for training has been dropped: training on a CPU takes
(with or without mixed precision) much longer than on GPUs, and it's not
an alternative anyone seriously considers. With the addition of mixed
precision, supporting both CPU and GPU would complicate the code too
much.