diff --git a/README.md b/README.md
index 72f6441766a3c11ef32757db0326b8052b7b8d37..01c2bdfa303d790bc4ba94cdad294b549dd0ae57 100644
--- a/README.md
+++ b/README.md
@@ -1,6 +1,6 @@
 # Optimus Prime
 
-(Yet another) PyTorch framework for training large language models.
+(Yet another) PyTorch framework for training large language models
 
 ## How to use
 
@@ -17,7 +17,7 @@ with adjustable top-p threshold and temperature values.
 
 ## Basic building blocks
 
-As its parent PyTorch, the framework is split between a number of modules. The
+Like PyTorch itself, this framework is split into a number of modules. The
 most important modules are the `OptimusDataLoader`, the `Dataset`s, the
 `Trainer`, the tokenizers and the models. These can be combined and adapted in
 any way, shape or form to train a model from scratch.
@@ -45,7 +45,7 @@ Of course, any number of the above can be used as defaults.
 There are a number of packages required to run the framework. Get your closest
 Python retailer and ask him to run the following command:
 
-`pip install torch fire sentencepiece fastprogress matplotlib`
+`pip install torch fire sentencepiece fastprogress`
 
 ## License