From 209826e4e39caaa9981603e089aa2a271a7cf05f Mon Sep 17 00:00:00 2001
From: Alexandru Gherghescu <gherghescu_alex1@yahoo.ro>
Date: Mon, 3 Jun 2024 22:24:11 +0300
Subject: [PATCH] README cleanup

---
 README.md | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/README.md b/README.md
index 72f6441..01c2bdf 100644
--- a/README.md
+++ b/README.md
@@ -1,6 +1,6 @@
 # Optimus Prime
 
-(Yet another) PyTorch framework for training large language models.
+(Yet another) PyTorch framework for training large language models
 
 ## How to use
 
@@ -17,7 +17,7 @@ with adjustable top-p threshold and temperature values.
 
 ## Basic building blocks
 
-As its parent PyTorch, the framework is split between a number of modules. The
+Like PyTorch itself, this framework is split into a number of modules. The
 most important modules are the `OptimusDataLoader`, the `Dataset`s, the
 `Trainer`, the tokenizers and the models. These can be combined and adapted in
 any way, shape or form to train a model from scratch.
@@ -45,7 +45,7 @@ Of course, any number of the above can be used as defaults.
 There are a number of packages required to run the framework. Get your closest
 Python retailer and ask him to run the following command:
 
-`pip install torch fire sentencepiece fastprogress matplotlib`
+`pip install torch fire sentencepiece fastprogress`
 
 ## License
 
-- 
GitLab
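
Note on the paragraph touched by the second hunk: the README describes a framework whose main pieces (`OptimusDataLoader`, the `Dataset`s, the `Trainer`, the tokenizers and the models) can be combined to train a model from scratch. The sketch below only illustrates that idea; it is not taken from the repository. Every import path, class beyond the names quoted above, argument and method shown here is an assumption made purely for illustration.

```python
# Hypothetical usage sketch -- NOT the actual Optimus Prime API.
# Only the component names (OptimusDataLoader, Dataset, Trainer, a tokenizer,
# a model) come from the README; all import paths, signatures and methods
# below are assumptions for illustration only.
from optimus.dataloader import OptimusDataLoader      # assumed module path
from optimus.datasets import TextDataset              # assumed Dataset class
from optimus.tokenizers import SentencePieceTokenizer # assumed tokenizer wrapper
from optimus.models import TransformerLM              # assumed model class
from optimus.trainer import Trainer                   # assumed module path

# Build the pieces and combine them, as the README suggests.
tokenizer = SentencePieceTokenizer("tokenizer.model")          # assumed signature
train_ds = TextDataset("data/train.txt")                       # assumed signature
valid_ds = TextDataset("data/valid.txt")                       # assumed signature

dl = OptimusDataLoader(train_ds, valid_ds, tokenizer,
                       bs=8, seq_len=512)                      # assumed signature
model = TransformerLM(vocab_size=len(tokenizer))               # assumed signature

trainer = Trainer(dl, model, lr=3e-4, n_epochs=1)              # assumed signature
trainer.fit()                                                  # assumed method
```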