# Adding our Environment Dependencies in Requirements Text Files

*pathfinder-analytics-uk/dab_project GitHub Wiki*
## Commands
### Generating the requirements file for the PySpark environment

First, activate the `.venv_pyspark` environment, then run:

```shell
pip freeze > requirements-pyspark.txt
```
### Generating the requirements file for the Databricks Connect environment

First, activate the `.venv_dbc` environment, then run:

```shell
pip freeze > requirements-dbc.txt
```
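Putting both steps together, here is a minimal sketch of the full snapshot workflow. It assumes the two virtual environments live in the project root and a POSIX shell (on Windows, the activate script is under `Scripts\` rather than `bin/`); the `python3 -m venv` lines only matter if the environments have not been created yet.

```shell
# Create the environments if they do not already exist
# (the project assumes they were set up in an earlier step;
# running venv on an existing directory leaves it intact).
python3 -m venv .venv_pyspark
python3 -m venv .venv_dbc

# Snapshot the PySpark environment's installed packages.
source .venv_pyspark/bin/activate
pip freeze > requirements-pyspark.txt
deactivate

# Snapshot the Databricks Connect environment's installed packages.
source .venv_dbc/bin/activate
pip freeze > requirements-dbc.txt
deactivate
```

Because `pip freeze` only sees the currently active environment, deactivating between the two snapshots keeps each file scoped to its own environment.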
## Project Code
### requirements-pyspark.txt

```text
coverage==7.8.0
googleapis-common-protos==1.63.2
grpcio==1.60.0
grpcio-status==1.60.0
iniconfig==2.1.0
numpy==1.23.5
packaging==25.0
pandas==1.5.3
pluggy==1.5.0
protobuf==5.29.4
py4j==0.10.9.7
pyarrow==14.0.1
pyspark==3.5.0
pytest==8.3.5
pytest-cov==6.1.1
python-dateutil==2.9.0.post0
pytz==2025.2
six==1.17.0
```
### requirements-dbc.txt

```text
appnope==0.1.4
asttokens==3.0.0
cachetools==5.5.2
certifi==2025.1.31
charset-normalizer==3.4.1
comm==0.2.2
databricks-connect==15.4.0
databricks-sdk==0.49.0
debugpy==1.8.13
decorator==5.2.1
executing==2.2.0
google-auth==2.38.0
googleapis-common-protos==1.69.2
grpcio==1.71.0
grpcio-status==1.71.0
idna==3.10
iniconfig==2.1.0
ipykernel==6.29.5
ipython==9.1.0
ipython_pygments_lexers==1.1.1
jedi==0.19.2
jupyter_client==8.6.3
jupyter_core==5.7.2
matplotlib-inline==0.1.7
nest-asyncio==1.6.0
numpy==1.26.4
packaging==24.2
pandas==2.2.3
parso==0.8.4
pexpect==4.9.0
platformdirs==4.3.7
pluggy==1.5.0
prompt_toolkit==3.0.50
protobuf==5.29.4
psutil==7.0.0
ptyprocess==0.7.0
pure_eval==0.2.3
py4j==0.10.9.7
pyarrow==19.0.1
pyasn1==0.6.1
pyasn1_modules==0.4.2
Pygments==2.19.1
pytest==8.3.5
python-dateutil==2.9.0.post0
pytz==2025.2
pyzmq==26.4.0
requests==2.32.3
rsa==4.9
six==1.17.0
stack-data==0.6.3
tornado==6.4.2
traitlets==5.14.3
typing_extensions==4.13.1
tzdata==2025.2
urllib3==2.3.0
wcwidth==0.2.13
```
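Several libraries (for example `py4j`, `six`, and `protobuf`) are pinned in both files, and some that appear in both (such as `numpy`, `pandas`, and `grpcio`) are pinned at different versions, so it can be useful to check which pins the two environments share verbatim. One way is `comm(1)` over the sorted files; the sketch below uses two tiny stand-in files so it runs anywhere, but in the project you would point it at `requirements-pyspark.txt` and `requirements-dbc.txt`. It assumes a bash-compatible shell for the `<(...)` process substitution.

```shell
# Stand-in snapshots (in the project, use the real requirements files).
printf 'py4j==0.10.9.7\npyspark==3.5.0\nsix==1.17.0\n' > reqs-a.txt
printf 'py4j==0.10.9.7\nrequests==2.32.3\nsix==1.17.0\n' > reqs-b.txt

# comm -12 prints only lines common to both inputs (which must be sorted):
# here that is the py4j and six pins shared by the two files.
comm -12 <(sort reqs-a.txt) <(sort reqs-b.txt)
```

Pins that appear in both files but at different versions will not show up in this output, which is exactly what makes the shared list a quick drift check.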