pyforest lazy-imports all popular Python Data Science libraries so that they are always there when you need them. Once you use a package, pyforest imports it and even adds the import statement to your first Jupyter cell. If you don't use a library, it won't be imported.
After you have installed pyforest and its Jupyter extension, you can use your favorite Python Data Science commands as you normally would - just without writing the imports.
For example, if you want to read a CSV with pandas:
```python
df = pd.read_csv("titanic.csv")
```
pyforest will automatically import pandas for you and add the import statement to the first cell:
```python
import pandas as pd
```
Which libraries are available?
- We aim to add all popular Python Data Science libraries, which should account for >99% of your daily imports. For example, we already added pandas, numpy, sklearn, and many more. In addition, there are also helper modules like os and re.
- You can see an overview of all currently available imports in the pyforest imports file.
- If you are missing an import, you can either add it to your user-specific pyforest imports as described in the FAQs below, or open a pull request for the official pyforest imports.
In order to gather all the most important names, we need your help. Please open a pull request and add the imports that we are still missing.
You need Python 3.6 or above because we love f-strings.
From the terminal (or Anaconda prompt in Windows), enter:

```shell
pip install --upgrade pyforest
python -m pyforest install_extensions
```
Installing the Jupyter extensions is also possible from within Python:

```python
import pyforest
pyforest.install_nbextension()
pyforest.install_labextension()  # takes 30-60s due to the jupyter lab build
```
Also, please note that this will add pyforest to your IPython default startup settings. If you do not want this, you can disable the auto_import as described in the FAQs below.
"How to add my own import statements without adding them to the package source code?"
pyforest creates a file at `~/.pyforest/user_imports.py` in which you can type any explicit import statements you want (e.g. `import pandas as pd`). Your own custom imports take precedence over any other pyforest imports. Please note: implicit imports (e.g. `from pandas import *`) won't work.
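For illustration, the file might look like this (the path comes from the FAQ above; the specific imports are just examples, not part of pyforest's defaults):

```python
# Example contents of ~/.pyforest/user_imports.py
# Explicit import statements like these work:
import json
from pathlib import Path
from collections import Counter  # custom imports win over pyforest's defaults
# from pandas import *           # implicit star imports will NOT work
```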
"Doesn't this slow down my Jupyter or Python startup process?"
No. A library is only imported once you actually use it; until then, names like `pd` are only pyforest placeholders.
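A minimal sketch of how such a placeholder can work (an illustration of the general technique, not pyforest's actual implementation): the placeholder remembers only the module name and runs the real import on first attribute access.

```python
import importlib

class LazyModule:
    """Illustrative lazy-import placeholder (not pyforest's actual code)."""

    def __init__(self, module_name):
        self._module_name = module_name
        self._module = None  # the real module is not imported yet

    def __getattr__(self, attr):
        # The first attribute access triggers the actual import.
        if self._module is None:
            self._module = importlib.import_module(self._module_name)
        return getattr(self._module, attr)

math = LazyModule("math")  # no import has happened at this point
result = math.sqrt(16.0)   # importing "math" happens here
print(result)              # prints 4.0
```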
"Why can't I just use the typical IPython import?"
"I don't have and don't need tensorflow. What will happen when I use pyforest?"
"Will the pyforest variables interfere with my own local variables?"
"What about auto-completion on lazily imported modules?"
"How to (temporarily) deactivate the auto_import in IPython and Jupyter?"
Go to `~/.ipython/profile_default/startup` and adjust or delete the `pyforest_autoimport.py` file. You will find further instructions in the file. If you don't use the auto_import, you will need to import pyforest at the beginning of your notebook via `import pyforest`.
"How to (re)activate the pyforest auto_import?"
Run `from pyforest.auto_import import setup; setup()`. Please note that the auto_import only works for Jupyter and IPython.
"Can I use pyforest outside of the Jupyter Notebook or Lab?"
Yes. In a plain Python script, just start with `import pyforest`. Afterwards, you can get the currently active imports via `pyforest.active_imports()`.
"Why is the project called pyforest?"
In order to gather all the most important names, we need your help. Please open a pull request and add the imports that we are still missing to the pyforest imports. You can also find the guidelines in the pyforest imports file
pyforest helps you to minimize the (initial) import time of your package which improves the user experience. If you want your package imports to become lazy, rewrite your imports as follows:
Replace

```python
import pandas as pd
```

with

```python
from pyforest import LazyImport

pd = LazyImport("import pandas as pd")
```
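One way to convince yourself that the import really is deferred is to watch `sys.modules`. The sketch below uses a self-contained stand-in for `LazyImport` (assumed behavior, not the library's actual code), so it runs without pyforest installed:

```python
import importlib
import sys

class LazyImport:
    """Stand-in for pyforest's LazyImport: defers the import until first use."""

    def __init__(self, statement):
        # "import decimal as dec" -> module name "decimal"
        self._module_name = statement.split()[1]
        self._module = None

    def __getattr__(self, attr):
        if self._module is None:
            self._module = importlib.import_module(self._module_name)
        return getattr(self._module, attr)

sys.modules.pop("decimal", None)               # clean slate for the demo
dec = LazyImport("import decimal as dec")
before = "decimal" in sys.modules              # still False: nothing imported
total = dec.Decimal("1.5") + dec.Decimal("2")  # first use triggers the import
after = "decimal" in sys.modules               # now True
print(before, total, after)                    # prints: False 3.5 True
```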
pyforest is developed by 8080 Labs. Our goal is to improve the productivity of Python Data Scientists. If you like the speedup to your workflow, you might also be interested in our other project bamboolib
If you
- like our work, or
- want to become a faster Python Data Scientist, or
- want to discuss the future of the Python Data Science ecosystem, or
- are just interested in mingling with like-minded fellows,

then you are invited to join our slack.