Feb 15, 2024
      Add fp16 mixed precision training · 6db26eb1
      Alexandru-Mihai GHERGHESCU authored
This should give training a theoretical 2x speedup (in practice the gain is
usually smaller), with close to no loss in model quality.
      
The interface lets the user choose between mixed precision training and
regular training, which falls back to full float32 precision.
      
CPU training support has been dropped: training on CPU takes far longer
than on GPUs, with or without mixed precision, so it is not an alternative
anyone seriously considers. With the addition of mixed precision,
supporting both CPU and GPU would complicate the code too much.
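The core numerical issue behind fp16 mixed precision training can be shown
in a few lines. This is an illustrative sketch, not the repository's actual
code: small gradient values underflow to zero in fp16, which is why mixed
precision implementations typically scale the loss before the backward pass
and divide the scale back out before the optimizer step. The value `1e-8`
and the scale factor `2**16` below are arbitrary choices for the example.

```python
import numpy as np

# A small gradient-like value that fp32 represents fine...
tiny = np.float32(1e-8)

# ...but that underflows to zero when cast to fp16, since the smallest
# fp16 subnormal is about 6e-8.
assert np.float16(tiny) == 0.0

# Loss scaling: multiply before casting to fp16 so the value stays
# representable, then divide the scale back out in fp32.
scale = np.float32(2.0 ** 16)
scaled = np.float16(tiny * scale)   # representable in fp16 now
recovered = np.float32(scaled) / scale

assert recovered > 0.0              # the gradient signal survives
```

In practice this bookkeeping is handled by a gradient scaler (e.g. PyTorch's
`torch.cuda.amp.GradScaler`) rather than done by hand.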