python - feliyur/exercises GitHub Wiki
See here for an overview of various packages to manage virtual envs in python.
Starting with Python 3.3 there is a built-in module `venv` to manage virtual environments. It can create a virtual environment at a provided location.
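As a minimal sketch, the `venv` module can also be driven programmatically (the `myenv` target path here is just an example; `with_pip=False` skips bootstrapping pip into the new environment):

```python
import os
import tempfile
import venv

# Create a virtual environment at a provided location, without bootstrapping pip.
target = os.path.join(tempfile.mkdtemp(), "myenv")
venv.EnvBuilder(with_pip=False).create(target)

# On Linux/macOS the activation script lands under bin/
print(os.path.exists(os.path.join(target, "bin", "activate")))
```

From the shell the equivalent is `python3 -m venv <location>`, then `source <location>/bin/activate`.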
The tool `virtualenvwrapper` builds on this capability. It stores virtual environments in a single pre-defined location and provides scripts to manipulate them: list, create, remove, activate, and more.
Installation is described here. In short, run:
pip install virtualenvwrapper
This normally installs `virtualenvwrapper` for your user (since pip lacks write permission to the system site-packages). It creates scripts in `$HOME/.local/bin`, which then needs to be added to `PATH` for the scripts to be found; this can be done in `.bashrc`. It is also worth specifying there the full path to the Python interpreter for which `virtualenvwrapper` has been installed. In all, add the following three lines to `.bashrc`, adapting them as required:
export PATH="$PATH:$HOME/.local/bin"
export VIRTUALENVWRAPPER_PYTHON=/usr/local/bin/python3
source virtualenvwrapper.sh
Command to launch TensorBoard:
export TMPDIR=/tmp/$USER; mkdir -p $TMPDIR; tensorboard serve --logdir="logs/root/dir (searches recursively)" --bind_all
Default port: 6006. To connect remotely, either pass the `--bind_all` argument and open `http://<server-ip>:6006` in a browser, or use SSH port forwarding: `ssh <tensorboard server ip> -L <local-port>:localhost:6006`.
pip install jupyterlab
pip install ipykernel
python -m ipykernel install --user --name ${VIRTUAL_ENV##*/}
Default port: 8888. To connect remotely, use port forwarding through SSH:
ssh <jupyterlab server ip> -L<local-port>:localhost:8888
jupyterlab-vim
https://python-packaging.readthedocs.io/en/latest/command-line-scripts.html
setup(
    ...
    entry_points={
        'console_scripts': ['funniest-joke=my.module:function'],
    },
    scripts=['path/to/script/file.sh'],  # paths w.r.t. setup.py
    ...
)
Further docs here.
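For reference, the target of a `console_scripts` entry point is just a plain function; the `my/module.py` path and `function` name below mirror the hypothetical names in the snippet above:

```python
# my/module.py (hypothetical) -- target of the 'funniest-joke' console script
import sys

def function(argv=None):
    """Called with no arguments by the generated wrapper; reads sys.argv itself.

    An integer return value becomes the process exit code.
    """
    argv = sys.argv[1:] if argv is None else argv
    print("funniest-joke arguments:", argv)
    return 0
```

After `pip install`, setuptools generates a `funniest-joke` executable on `PATH` that imports `my.module` and calls `function()`.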
To step into a model, look into the `run_eagerly` parameter of `model.compile()`.
To step into a dataloader, do:
import pydevd
def dataloader_function_to_debug():
    pydevd.settrace(suspend=False)
Possibly also need to set the number of workers to 0 when creating the dataloader.
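A defensive variant of the above (a sketch; `pydevd` must match the running debug server, and the guard lets the same code run when no debugger is installed or reachable):

```python
def dataloader_function_to_debug(batch):
    # Attach this worker thread/process to an already-running pydevd server;
    # suspend=False means execution continues until a breakpoint is hit.
    try:
        import pydevd
        pydevd.settrace(suspend=False)
    except Exception:
        pass  # no debugger installed/reachable -- run normally
    return batch
```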
| Command | Notes |
|---|---|
| `conda info` | General info, location of venvs. |
| `conda env list` | List conda environments. |
| Create `<venv dir>/etc/conda/activate.d` or `.../deactivate.d` | Can put in activate/deactivate scripts. |
| `conda clean -a` | Cleans package cache (and frees inodes). Does not happen automatically. |
| `conda create -n "myenv" python=3.3.0 ipython` | Create with a specific Python version + the ipython package (for example). |
| `conda list <package name>` | Show information on a package, like `pip show`. |
| `conda env remove --name <environment>` | Remove an environment. |
Create an environment in a custom directory and register that directory with conda:
conda create -p <path-to-directory>/envs/embodiedscan python=3.8 -y
conda config --append envs_dirs <path-to-directory>/envs
MySQL default port: 3306
sudo apt-get install mysql-client
ssh -N -L<local port>:localhost:3306 <sql server host>
Then check that you can connect:
mysql -u <sql database user> -p -h 127.0.0.1 -P <local port>
The corresponding url should look like:
mysql://<sql database user>@127.0.0.1:<local port>/<database name>
import pandas as pd
df = pd.DataFrame(...)
| Command | Notes |
|---|---|
| `df.iterrows()` | Iterate over rows. |
| `df.iteritems()` | Iterate over columns. |
| `pd.concat([df_a, df_b], axis=1)` | Concatenate horizontally / add columns (`axis=1`) or vertically / add rows (`axis=0`). |
| `df.loc[<condition>]` | Select rows by condition (boolean vector). |
| `df.iloc[indexes]` | Select rows by integer index. |
| `pd.read_csv(<filename>)` | Read a csv as a dataframe. |
| `df.to_csv(<filename>)` | Save a dataframe to csv. |
| `df.transpose()` | Transpose the dataframe. |
| `df.to_list()`, `df.to_numpy()` | Convert to a list (Series) / NumPy array. |
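A few of the commands above in action (a small sketch with made-up data; note that `iteritems` is deprecated in recent pandas in favor of `items`):

```python
import pandas as pd

df_a = pd.DataFrame({"x": [1, 2]})
df_b = pd.DataFrame({"y": [3, 4]})

wide = pd.concat([df_a, df_b], axis=1)  # adds columns: x and y side by side
tall = pd.concat([df_a, df_a], axis=0)  # adds rows: 4 rows of x

# Iterate over rows; each row is a Series indexed by column name.
row_sums = [row["x"] + row["y"] for _, row in wide.iterrows()]

# Boolean-condition row selection.
selected = wide.loc[wide["x"] > 1]
```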
When using `pip install -e`, the installed path is added to a certain file `./lib/python3.8/site-packages/easy-install.pth` (under the venv root, if inside a venv), and subsequently shows up in `sys.path`.
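The `.pth` mechanism itself can be demonstrated with the stdlib `site` module (a sketch using a temporary directory; editable installs rely on the same mechanism in the venv's real site-packages):

```python
import os
import site
import sys
import tempfile

# Simulate what an editable install does: a .pth file inside a site
# directory lists paths that get appended to sys.path.
site_dir = tempfile.mkdtemp()
project_dir = os.path.join(site_dir, "myproject")
os.makedirs(project_dir)

with open(os.path.join(site_dir, "easy-install.pth"), "w") as f:
    f.write(project_dir + "\n")

site.addsitedir(site_dir)  # processes the directory's .pth files
print(project_dir in sys.path)
```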
In a Jupyter notebook, when running system commands with `!`, note that commands run inside the environment JupyterLab was launched in (and not necessarily in the environment of the selected kernel).
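One quick way to see the mismatch from inside a notebook (a sketch; the commented `!` line is illustrative and only works in a notebook cell):

```python
import sys

# The kernel's interpreter -- determined by the selected kernel:
print(sys.executable)

# In a notebook cell, compare with what `!` sees -- determined by the
# environment JupyterLab was launched from:
# !which python
```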
Update January 2023: standard options include pybind11, Cython, SWIG, the CPython C API, and boost.python. The CPython C API is Python's native interfacing-to-C support; all the rest build upon it.
pybind11 is a C++ library: the interface is defined in C++, in a manner reminiscent of boost.python, but it is a newer library and doesn't carry the weight of boost with it. Cython is a Python library: the interface is defined in a very Python-like language, which can contain Python code that is either invoked directly or compiled to make it efficient, even without interfacing C. In addition to pybind11 and Cython, which are probably the most widely used, there is SWIG, where the interface is defined in its own C-like language and which can generate interfaces for languages other than Python as well. Finally there is boost.python, which is similar to pybind11 but has been around longer; on the downside it may have more limited support for some language features, and it comes as part of boost.
Older version:
Python has native interfacing-to-C support. Alternative, more powerful methods mentioned here include ctypes, SWIG, boost.python, and pybind11 (mentioned as a newer and actively developed alternative to boost.python). This discusses the differences between boost.python and SWIG.
Update Feb 2019: pybind11 seems superior to boost.python because it is much lighter and does not depend on boost (which was the reason for its development in the first place). SWIG supports interfacing C/C++ to other scripting languages as well.
Update Aug 2019: Cython possibly provides fuller support of C++11 features than boost.python / SWIG.
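Of the options above, `ctypes` is the only one that needs no compilation step: it loads a shared library at runtime. A minimal sketch calling `strlen` from the C runtime library (assumes a system where `find_library("c")` resolves, e.g. Linux):

```python
import ctypes
import ctypes.util

# Locate and load the C runtime library.
libc = ctypes.CDLL(ctypes.util.find_library("c"))

# Declare the signature so ctypes converts arguments/results correctly.
libc.strlen.argtypes = [ctypes.c_char_p]
libc.strlen.restype = ctypes.c_size_t

print(libc.strlen(b"hello"))  # 5
```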
Requires boost.python binaries and CMake. For building/obtaining boost, can see my wiki page.
- Basic example: https://www.boost.org/doc/libs/1_67_0/libs/python/doc/html/tutorial/index.html
- Exposing classes: https://www.boost.org/doc/libs/1_63_0/libs/python/doc/html/tutorial/tutorial/exposing.html
- Also, excellent tutorial here.
For boost_python3 support one needs to upgrade to CMake 3.12 or newer. Do not touch the system's CMake (removing it may erroneously delete some of ROS's files). Instead, install a newer CMake (in some private dir) and use it from there, as described here and here (but without the `purge`).
Download cmake:
version=3.13
build=2
cd <installation directory>
wget https://github.com/Kitware/CMake/releases/download/v$version.$build/cmake-$version.$build.tar.gz
tar -xzvf cmake-$version.$build.tar.gz
Build and install to the prefix (the prefix below matches the paths added to .bashrc in the next step):
cd cmake-$version.$build
./bootstrap --prefix=$HOME/cmake-install
make
make install
Add to path (e.g. .bashrc):
export PATH=$HOME/cmake-install/bin:$PATH
export CMAKE_PREFIX_PATH=$HOME/cmake-install:$CMAKE_PREFIX_PATH
| Error message / symptom | Solution |
|---|---|
| `ImportError: Module not found` / `no module named xxx` | Possible cause: wrong filename extension. Must be `.pyd` on Windows, `.so` on Linux. |
| `ImportError: dynamic module does not define init function` / `ImportError: dynamic module does not define module export function (PyInit_xxx)` | Filename different from the module name declared by `BOOST_PYTHON_MODULE`. On Linux often caused by `.so`/`.a` files being prefixed with `lib`. Either add the `lib` prefix to `BOOST_PYTHON_MODULE`, or add `set_target_properties(xxx PROPERTIES PREFIX "")` to `CMakeLists.txt`. |
| `SystemError: initialization of <library> raised unreported exception` | Need to import any dependent modules first (e.g. if the module uses the gtsam library, which is also used directly from Python, import gtsam before importing the module). Another cause: lacking export of an identifier used in the interface (`BOOST_PYTHON_MODULE` block), e.g. an exported class ClsModelFake1 inherits from ClsModel, which is not exported. Note that in Python 3, boost.python reports a plethora of initialization-time errors as "unreported exception". |
| Segfault on library import | Incompatibility with dependent libraries, e.g. the module uses a library compiled with a different/incompatible Eigen version. |
- Generate an animated GIF with Pillow
- Visualization example snippets
- For parallel `setuptools` builds prepend `pip install` with `MAKEFLAGS="-j<XX>"`, where `<XX>` is the number of cores desired. Can also pass other arguments to make, e.g. `MAKEFLAGS="-j10 -std=c++17" pip install -e .`