conda-forge / mpich-feedstock
A conda-smithy repository for mpich.
License: BSD 3-Clause "New" or "Revised" License
Seems to be suffering from a bug you all resolved w.r.t. -fallow-argument-mismatch.
Your help is appreciated.
It looks like only the first Linux build was successfully uploaded:
https://anaconda.org/conda-forge/mpich/files
Is this normal?
Support CUDA.
The mpich README contains this documentation:
GPU support
***********
GPU support is automatically enabled if CUDA, ZE, or HIP runtime is
detected during configure. To specify where your GPU runtime is
installed, use:
--with-cuda=<path> or --with-ze=<path> or --with-hip=<path>
If the lib/ and include/ are not in the same path, both can be specified
separately, for example:
--with-cuda-include= and --with-cuda-lib=
In addition, GPU support can be explicitly disabled by using:
--without-cuda or --without-ze or --without-hip
If desirable, GPU support can be disabled during runtime by setting
environment variable MPIR_CVAR_ENABLE_GPU=0. This may help avoid the GPU
initialization and detection overhead for non-GPU applications.
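The options described above can be combined as in the following sketch (the paths are placeholders, not the locations used by any particular build):

```shell
# Enable CUDA support explicitly (path is a placeholder):
./configure --with-cuda=/usr/local/cuda

# If headers and libraries live in different places, specify them separately:
./configure --with-cuda-include=/usr/local/cuda/include \
            --with-cuda-lib=/usr/local/cuda/lib64

# Disable GPU support at build time:
./configure --without-cuda

# Or keep GPU support built in but disable it at runtime
# to avoid the GPU initialization/detection overhead:
MPIR_CVAR_ENABLE_GPU=0 mpirun -np 2 ./my_app
```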
Issue:
Please see conda/conda-build#3438 for a full description and minimal reproducer. The conda-build
team feels that this is an issue with the mpich
package. I think I see where the out-of-tree library path is being hardcoded into executables built with e.g. the mpifort
wrapper script, but I don't understand what that script is doing well enough to know whether there's a way to disable this path and, if so, whether doing so would render the executable unusable. I'd be happy to use any workaround that might be suggested. Thanks in advance for any ideas.
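For inspecting what the wrapper records, MPICH's compiler wrappers support a -show option and per-invocation compiler overrides; the commands below are a diagnostic sketch (the source file name is illustrative):

```shell
# Print the full compile/link command the wrapper would run,
# including any hardcoded library paths, without executing it:
mpicc -show
mpifort -show

# The compiler recorded at build time can be overridden per invocation
# via the MPICH_CC / MPICH_CXX / MPICH_FC environment variables:
MPICH_CC=gcc mpicc main.c -o main
```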
MPICH should work on Windows, so it would be nice to add support for it here.
Issue:
Executing mpicc
gives:
/bin/bash: .../miniconda3/envs/cfmid-build/lib/libtinfo.so.6: no version information available (required by /bin/bash)
Error: Command line argument is needed!
/bin/bash: .../miniconda3/envs/cfmid-build/lib/libtinfo.so.6: no version information available (required by /bin/bash)
Guess this is a problem with ncurses ...? Apparently makes cmake fail to find the MPI installation.
Environment (conda list):
conda list
# packages in environment at /home/berntm/miniconda3/envs/cfmid-build:
#
# Name Version Build Channel
_libgcc_mutex 0.1 conda_forge conda-forge
_openmp_mutex 4.5 1_gnu conda-forge
boost 1.68.0 py37h8619c78_1001 conda-forge
boost-cpp 1.68.0 h11c811c_1000 conda-forge
bzip2 1.0.8 h7f98852_4 conda-forge
ca-certificates 2020.12.5 ha878542_0 conda-forge
cairo 1.16.0 h18b612c_1001 conda-forge
certifi 2020.12.5 py37h89c1867_0 conda-forge
fontconfig 2.13.1 he4413a7_1000 conda-forge
freetype 2.10.4 h7ca028e_0 conda-forge
gettext 0.19.8.1 h0b5b191_1005 conda-forge
glib 2.66.3 h9c3ff4c_1 conda-forge
icu 58.2 hf484d3e_1000 conda-forge
jpeg 9d h36c2ea0_0 conda-forge
lcms2 2.11 hcbb858e_1 conda-forge
ld_impl_linux-64 2.35.1 hed1e6ac_0 conda-forge
libblas 3.9.0 3_openblas conda-forge
libcblas 3.9.0 3_openblas conda-forge
libffi 3.3 h58526e2_1 conda-forge
libgcc-ng 9.3.0 h5dbcf3e_17 conda-forge
libgfortran-ng 7.5.0 hae1eefd_17 conda-forge
libgfortran4 7.5.0 hae1eefd_17 conda-forge
libglib 2.66.3 h1f3bc88_1 conda-forge
libgomp 9.3.0 h5dbcf3e_17 conda-forge
libiconv 1.16 h516909a_0 conda-forge
liblapack 3.9.0 3_openblas conda-forge
liblbfgs 1.10 h6bb024c_0 conda-forge
libopenblas 0.3.12 pthreads_hb3c22a3_1 conda-forge
libpng 1.6.37 h21135ba_2 conda-forge
libstdcxx-ng 9.3.0 h2ae2ef3_17 conda-forge
libtiff 4.1.0 h4f3a223_6 conda-forge
libuuid 2.32.1 h7f98852_1000 conda-forge
libwebp-base 1.1.0 h36c2ea0_3 conda-forge
libxcb 1.13 h14c3975_1002 conda-forge
libxml2 2.9.9 h13577e0_2 conda-forge
lp_solve 5.5.2.5 h14c3975_1001 conda-forge
lz4-c 1.9.2 he1b5a44_3 conda-forge
mpi 1.0 mpich conda-forge
mpich 3.3.2 h846660c_5 conda-forge
ncurses 6.2 h58526e2_4 conda-forge
numpy 1.19.4 py37h7e9df27_1 conda-forge
olefile 0.46 pyh9f0ad1d_1 conda-forge
openssl 1.1.1h h516909a_0 conda-forge
pandas 1.1.4 py37h10a2094_0 conda-forge
pcre 8.44 he1b5a44_0 conda-forge
pillow 8.0.1 py37h63a5d19_0 conda-forge
pip 20.3.1 pyhd8ed1ab_0 conda-forge
pixman 0.38.0 h516909a_1003 conda-forge
pthread-stubs 0.4 h36c2ea0_1001 conda-forge
pycairo 1.20.0 py37h01af8b0_1 conda-forge
python 3.7.8 hffdb5ce_3_cpython conda-forge
python-dateutil 2.8.1 py_0 conda-forge
python_abi 3.7 1_cp37m conda-forge
pytz 2020.4 pyhd8ed1ab_0 conda-forge
rdkit 2018.09.1 py37h270f4b7_1001 conda-forge
readline 8.0 he28a2e2_2 conda-forge
setuptools 49.6.0 py37he5f6b98_2 conda-forge
six 1.15.0 pyh9f0ad1d_0 conda-forge
sqlite 3.34.0 h74cdb3f_0 conda-forge
tk 8.6.10 hed695b0_1 conda-forge
wheel 0.36.1 pyhd3deb0d_0 conda-forge
xorg-kbproto 1.0.7 h14c3975_1002 conda-forge
xorg-libice 1.0.10 h516909a_0 conda-forge
xorg-libsm 1.2.3 h84519dc_1000 conda-forge
xorg-libx11 1.6.12 h516909a_0 conda-forge
xorg-libxau 1.0.9 h14c3975_0 conda-forge
xorg-libxdmcp 1.1.3 h516909a_0 conda-forge
xorg-libxext 1.3.4 h516909a_0 conda-forge
xorg-libxrender 0.9.10 h516909a_1002 conda-forge
xorg-renderproto 0.11.1 h14c3975_1002 conda-forge
xorg-xextproto 7.3.0 h14c3975_1002 conda-forge
xorg-xproto 7.0.31 h7f98852_1007 conda-forge
xz 5.2.5 h516909a_1 conda-forge
zlib 1.2.11 h516909a_1010 conda-forge
zstd 1.4.5 h6597ccf_2 conda-forge
conda and system (conda info):
conda info
active environment : cfmid-build
active env location : /home/berntm/miniconda3/envs/cfmid-build
shell level : 1
user config file : /home/berntm/.condarc
populated config files : /home/berntm/.condarc
conda version : 4.8.3
conda-build version : 3.18.11
python version : 3.7.3.final.0
virtual packages : __glibc=2.31
base environment : /home/berntm/miniconda3 (writable)
channel URLs : https://conda.anaconda.org/anaconda/linux-64
https://conda.anaconda.org/anaconda/noarch
https://repo.anaconda.com/pkgs/main/linux-64
https://repo.anaconda.com/pkgs/main/noarch
https://repo.anaconda.com/pkgs/r/linux-64
https://repo.anaconda.com/pkgs/r/noarch
package cache : /home/berntm/miniconda3/pkgs
/home/berntm/.conda/pkgs
envs directories : /home/berntm/miniconda3/envs
/home/berntm/.conda/envs
platform : linux-64
user-agent : conda/4.8.3 requests/2.24.0 CPython/3.7.3 Linux/5.4.0-53-generic ubuntu/20.04.1 glibc/2.31
UID:GID : 1000:1000
netrc file : None
offline mode : False
Issue:
Since osx-64/mpich-3.2.1-h26a2512_6.tar.bz2
was uploaded to anaconda.org on 11/19/18, my conda recipes that use mpicc
have been broken. Apologies if this is a Travis issue.
To reproduce, run the following on Travis with the default osx image (MacOS 10.13 and Xcode 9.4.1).
$ conda create -n mpich_6 -c conda-forge mpich
$ source activate mpich_6
$ echo "int main() { return 0; }" > main.c
$ mpicc main.c
clang: warning: no such sysroot directory: '/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX10.9.sdk' [-Wmissing-sysroot]
ld: library not found for -lc++
clang: error: linker command failed with exit code 1 (use -v to see invocation)
If I use the previous build, osx-64/mpich-3.2.1-h26a2512_5
, everything works as expected:
$ conda create -n mpich_5 -c conda-forge mpich=3.2.1=h26a2512_5
$ source activate mpich_5
$ echo "int main() { return 0; }" > main.c
$ mpicc main.c
# success
Environment (conda list):
$ conda list
# packages in environment at /Users/travis/miniconda/envs/mpich_6:
#
# Name Version Build Channel
libgfortran 3.0.0 1 conda-forge
mpi 1.0 mpich conda-forge
mpich 3.2.1 h26a2512_6 conda-forge
conda and system (conda info):
$ conda info
active environment : mpich_6
active env location : /Users/travis/miniconda/envs/mpich_6
shell level : 1
user config file : /Users/travis/.condarc
populated config files : /Users/travis/.condarc
conda version : 4.5.11
conda-build version : 3.16.3
python version : 3.7.1.final.0
base environment : /Users/travis/miniconda (writable)
channel URLs : https://repo.anaconda.com/pkgs/main/osx-64
https://repo.anaconda.com/pkgs/main/noarch
https://repo.anaconda.com/pkgs/free/osx-64
https://repo.anaconda.com/pkgs/free/noarch
https://repo.anaconda.com/pkgs/r/osx-64
https://repo.anaconda.com/pkgs/r/noarch
https://repo.anaconda.com/pkgs/pro/osx-64
https://repo.anaconda.com/pkgs/pro/noarch
package cache : /Users/travis/miniconda/pkgs
/Users/travis/.conda/pkgs
envs directories : /Users/travis/miniconda/envs
/Users/travis/.conda/envs
platform : osx-64
user-agent : conda/4.5.11 requests/2.19.1 CPython/3.7.1 Darwin/17.4.0 OSX/10.13.3
UID:GID : 501:20
netrc file : None
offline mode : False
Some questions about paths and executables with the new compilers, prompted by #24 since MPI includes references to the compiler in the package and calls out to the compilers at runtime.
Now that we have conda-packaged compilers, there are two questions to answer for what $CC and friends should be:
x86_64-apple-darwin13.4.0-clang
or the 'public' name clang
(same goes for x86_64-conda_cos6-linux-gnu-cc)

| | abspath | basename |
| --- | --- | --- |
| host | $PREFIX/bin/x86_64-apple-darwin13.4.0-clang | x86_64-apple-darwin13.4.0-clang |
| no host | $PREFIX/bin/clang | clang |
I'm not sure what the answer is for these in default behavior, but I'll illustrate how it affects MPI packages below. For packages that don't preserve references to how they were compiled, I don't believe either of these makes a difference. My inclination is to use the abspath+no host combination.
MPI providers are in a semi-unique situation in that they provide compiler wrappers mpicc, mpifort, etc. For the most part, this involves recording a reference to the $CC, $CXX, and $FC environment variables in the wrappers themselves.
The result is that the values of $CC, etc. at build time for mpi are relevant to downstream packages at runtime.
The first hiccup I ran into that prompted this was [this one],
where builds succeeded but tests failed:
On linux, it failed immediately:
(build dir prefix truncated to $build
, _placehol...
stripped)
prefix=$build/_test_env --disable-dependency-tracking --enable-cxx --enable-fortran
MPICH CC: $build/_build_env/bin/x86_64-conda_cos6-linux-gnu-cc -march=nocona -mtune=haswell -ftree-vectorize -fPIC -fstack-protector-strong -fno-plt -O2 -pipe -I$build/_test_env/include -fdebug-prefix-map=${SRC_DIR}=/usr/local/src/conda/${PKG_NAME}-${PKG_VERSION} -fdebug-prefix-map=${PREFIX}=/usr/local/src/conda-prefix -O2
MPICH CXX: $build/_build_env/bin/x86_64-conda_cos6-linux-gnu-c++ -fvisibility-inlines-hidden -std=c++17 -fmessage-length=0 -march=nocona -mtune=haswell -ftree-vectorize -fPIC -fstack-protector-strong -fno-plt -O2 -pipe -I$build/_test_env/include -fdebug-prefix-map=${SRC_DIR}=/usr/local/src/conda/${PKG_NAME}-${PKG_VERSION} -fdebug-prefix-map=${PREFIX}=/usr/local/src/conda-prefix -O2
MPICH F77: $build/_build_env/bin/x86_64-conda_cos6-linux-gnu-gfortran -fopenmp -march=nocona -mtune=haswell -ftree-vectorize -fPIC -fstack-protector-strong -fno-plt -O2 -pipe -I$build/_test_env/include -fdebug-prefix-map=${SRC_DIR}=/usr/local/src/conda/${PKG_NAME}-${PKG_VERSION} -fdebug-prefix-map=${PREFIX}=/usr/local/src/conda-prefix -O2
MPICH FC: $build/_build_env/bin/x86_64-conda_cos6-linux-gnu-gfortran -O2
...
$build/_test_env/bin/mpicc: line 278: $build/_build_env/bin/x86_64-conda_cos6-linux-gnu-cc: No such file or directory
while on mac, the C compilers succeeded and only Fortran failed.
This is because on mac,
MPICH CC: x86_64-apple-darwin13.4.0-clang -march=core2 -mtune=haswell -mssse3 -ftree-vectorize -fPIC -fPIE -fstack-protector-strong -O2 -pipe -I$build/_test_env/include -fdebug-prefix-map=${SRC_DIR}=/usr/local/src/conda/${PKG_NAME}-${PKG_VERSION} -fdebug-prefix-map=${PREFIX}=/usr/local/src/conda-prefix -O2
MPICH CXX: x86_64-apple-darwin13.4.0-clang++ -march=core2 -mtune=haswell -mssse3 -ftree-vectorize -fPIC -fPIE -fstack-protector-strong -O2 -pipe -stdlib=libc++ -fvisibility-inlines-hidden -std=c++14 -fmessage-length=0 -I$build/_test_env/include -fdebug-prefix-map=${SRC_DIR}=/usr/local/src/conda/${PKG_NAME}-${PKG_VERSION} -fdebug-prefix-map=${PREFIX}=/usr/local/src/conda-prefix -O2
MPICH F77: $build/_build_env/bin/x86_64-apple-darwin13.4.0-gfortran -march=nocona -mtune=core2 -ftree-vectorize -fPIC -fstack-protector -O2 -pipe -I$build/_test_env/include -fdebug-prefix-map=${SRC_DIR}=/usr/local/src/conda/${PKG_NAME}-${PKG_VERSION} -fdebug-prefix-map=${PREFIX}=/usr/local/src/conda-prefix -O2
MPICH FC: $build/_build_env/bin/x86_64-apple-darwin13.4.0-gfortran -O2
...
$build/_test_env/bin/mpif77: line 369: $build/_build_env/bin/x86_64-apple-darwin13.4.0-gfortran: No such file or directory
bug in compiler packages
So the first issue, which I assume is a bug, is that the C compilers on mac use base name, while all other compilers use absolute paths. I assume these should be consistent, but I'm not sure which direction is right. Going with the majority would suggest that they should all use absolute paths. However, if all of the compilers used only their base name, like clang, the build would have succeeded without complaint.
This is failing for mpi packages because the compilers go in the build environment, so absolute paths don't get rewritten. I think the right thing to do for mpi in particular is actually to put compilers in the host environment. In my understanding, 'host' is the right place to put dependencies that the package may refer to, and mpi refers to the compilers that built it. It's not a runtime dependency, but it is often used in conjunction at runtime, so there should perhaps be appropriate restrictions when c compilers are requested in combination with mpi.
Absolute paths:
Conclusion: absolute paths seem like the right thing to use, and packages like mpi that include references to their compilers should signal this to conda by putting the compilers as host dependencies, not build dependencies.
Looking at these names also prompted me to think about having versions in the executable names. This leads to the question: should $CC be darwin13.4.0-clang
or just clang
? I don't know the answer, because I don't actually know what's the source of darwin13.4.0
or how/when it can change. It appears to come from conda-build's BUILD env, but I don't really know if/when/how that would be changed. If a change to that value means that a package is definitely incompatible, then the current behavior seems right. If updating that build would not break compatibility, then clang
is probably the right thing. In general, it seems like recording the 'public' name feels more correct, but I'd have to have a better understanding of what the host string really means in terms of compatibility. E.g. if libmpi is built with clang 4, but then a downstream library is built with clang 6, this is typically fine. The same goes for bumping base macos version - as long as downstream is always newer than upstream, it's typically safe.
A relevant example: Python uses basename to remove the env prefix from compilers, since it, too, records its compilers in order to build extensions, ultimately only recording gcc
for compilers, not the env or the host.
All that said, I think that just clang
(base name, no host) might end up getting the best results.
cc @conda-forge/core
Issue:
With a clean env, after conda install mpich, running mpirun gives Segmentation fault: 11
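A minimal reproduction sketch for a report like this (the env name is illustrative; hostname stands in for any trivial program):

```shell
# Fresh environment with only mpich from conda-forge:
conda create -n mpichtest -c conda-forge mpich
conda activate mpichtest

# If even launching a trivial non-MPI program crashes,
# the fault is in the launcher itself rather than the application:
mpirun -np 1 hostname
```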
Environment (conda list):
# Name Version Build Channel
libcxx 11.1.0 habf9029_0 conda-forge
libgfortran 5.0.0 9_3_0_h6c81a4c_22 conda-forge
libgfortran5 9.3.0 h6c81a4c_22 conda-forge
llvm-openmp 11.1.0 hda6cdc1_1 conda-forge
mpi 1.0 mpich conda-forge
mpich 3.4.1 hd33e60e_104 conda-forge
conda and system (conda info):
active environment : mpichtest
active env location : /Users/brey/miniconda3/envs/mpichtest
shell level : 2
user config file : /Users/brey/.condarc
populated config files : /Users/brey/.condarc
conda version : 4.10.0
conda-build version : 3.21.4
python version : 3.8.8.final.0
virtual packages : __osx=10.15.7=0
__unix=0=0
__archspec=1=x86_64
base environment : /Users/brey/miniconda3 (writable)
conda av data dir : /Users/brey/miniconda3/etc/conda
conda av metadata url : https://repo.anaconda.com/pkgs/main
channel URLs : https://conda.anaconda.org/conda-forge/osx-64
https://conda.anaconda.org/conda-forge/noarch
https://repo.anaconda.com/pkgs/main/osx-64
https://repo.anaconda.com/pkgs/main/noarch
https://repo.anaconda.com/pkgs/r/osx-64
https://repo.anaconda.com/pkgs/r/noarch
package cache : /Users/brey/miniconda3/pkgs
/Users/brey/.conda/pkgs
envs directories : /Users/brey/miniconda3/envs
/Users/brey/.conda/envs
platform : osx-64
user-agent : conda/4.10.0 requests/2.25.1 CPython/3.8.8 Darwin/19.6.0 OSX/10.15.7
UID:GID : 501:20
netrc file : None
offline mode : False
Some dynamic libraries have hardcoded build-time paths under
/Users/ray/...
I can't link with this path...
./lib/libfmpich.dylib:
...
/Users/ray/mc-x64-3.5/conda-bld/gcc-4.8_1477649012852/_b_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_plac/lib/libgcc_s.1.dylib (compatibility version 1.0.0, current version 1.0.0)
./lib/libmpi.12.dylib:
/Users/ray/mc-x64-3.5/conda-bld/gcc-4.8_1477649012852/_b_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_plac/lib/libgcc_s.1.dylib (compatibility version 1.0.0, c
...
The Fortran 2008 interface (mpi_f08) is not available in the Conda packaged version of MPICH.
Trying to compile the following code:
program test
use mpi_f08
end program test
with
mpifort test.f90
results in
test.f90:2:7:
2 | use mpi_f08
| 1
Fatal Error: Cannot open module file 'mpi_f08.mod' for reading at (1): No such file or directory
compilation terminated.
The root cause could eventually be conda-forge/ctng-compilers-feedstock#122, but I am not sure how to change the situation. (Due to this issue, I will unfortunately have to drop MPICH support from my software package on Conda and offer only OpenMPI, which supports the F08 interface on Conda.)
# packages in environment at /home/aradi/opt/miniconda3/envs/mpich-test:
#
# Name Version Build Channel
_libgcc_mutex 0.1 conda_forge conda-forge
_openmp_mutex 4.5 2_gnu conda-forge
_sysroot_linux-64_curr_repodata_hack 3 h69a702a_13 conda-forge
binutils_impl_linux-64 2.40 hf600244_0 conda-forge
gcc 13.2.0 h574f8da_2 conda-forge
gcc_impl_linux-64 13.2.0 h338b0a0_3 conda-forge
gfortran 13.2.0 h0584b13_2 conda-forge
gfortran_impl_linux-64 13.2.0 h76e1118_3 conda-forge
kernel-headers_linux-64 4.18.0 he073ed8_0 conda-forge
ld_impl_linux-64 2.40 h41732ed_0 conda-forge
libgcc-devel_linux-64 13.2.0 ha9c7c90_103 conda-forge
libgcc-ng 13.2.0 h807b86a_3 conda-forge
libgfortran-ng 13.2.0 h69a702a_3 conda-forge
libgfortran5 13.2.0 ha4646dd_3 conda-forge
libgomp 13.2.0 h807b86a_3 conda-forge
libsanitizer 13.2.0 h7e041cc_3 conda-forge
libstdcxx-ng 13.2.0 h7e041cc_3 conda-forge
mpi 1.0 mpich conda-forge
mpich 4.1.2 h846660c_102 conda-forge
sysroot_linux-64 2.28 he073ed8_0 conda-forge
active environment : mpich-test
active env location : /home/aradi/opt/miniconda3/envs/mpich-test
shell level : 2
user config file : /home/aradi/.condarc
populated config files : /home/aradi/.condarc
conda version : 23.11.0
conda-build version : not installed
python version : 3.9.16.final.0
solver : libmamba (default)
virtual packages : __archspec=1=skylake
__conda=23.11.0=0
__glibc=2.35=0
__linux=5.18.19=0
__unix=0=0
base environment : /home/aradi/opt/miniconda3 (writable)
conda av data dir : /home/aradi/opt/miniconda3/etc/conda
conda av metadata url : None
channel URLs : https://conda.anaconda.org/conda-forge/linux-64
https://conda.anaconda.org/conda-forge/noarch
https://repo.anaconda.com/pkgs/main/linux-64
https://repo.anaconda.com/pkgs/main/noarch
https://repo.anaconda.com/pkgs/r/linux-64
https://repo.anaconda.com/pkgs/r/noarch
package cache : /home/aradi/opt/miniconda3/pkgs
/home/aradi/.conda/pkgs
envs directories : /home/aradi/opt/miniconda3/envs
/home/aradi/.conda/envs
platform : linux-64
user-agent : conda/23.11.0 requests/2.31.0 CPython/3.9.16 Linux/5.18.19-051819-generic ubuntu/22.04.3 glibc/2.35 solver/libmamba conda-libmamba-solver/23.11.1 libmambapy/1.5.4
UID:GID : 1000:1000
netrc file : None
offline mode : False
Issue: When I create a conda environment with the latest mpich
package I get an error on Ubuntu 16.04 when trying to run mpic++
or mpicc
:
$ conda create -n mpi -c conda-forge mpich
$ source activate mpi
$ which mpic++
/home/chris/miniconda3/envs/mpi/bin/mpic++
$ mpic++ -v
mpicxx for MPICH version 3.2.1
/home/chris/miniconda3/envs/mpi/bin/mpic++: line 280: x86_64-conda_cos6-linux-gnu-c++: command not found
$ cat test.cpp
int main() { return 0; }
$ mpic++ test.cpp
/home/chris/miniconda3/envs/mpi/bin/mpic++: line 276: x86_64-conda_cos6-linux-gnu-c++: command not found
Environment (conda list):
$ conda list
# packages in environment at /home/chris/miniconda3/envs/mpi:
#
# Name Version Build Channel
libgcc-ng 7.3.0 hdf63c60_0 conda-forge
libgfortran-ng 7.2.0 hdf63c60_3 conda-forge
libstdcxx-ng 7.3.0 hdf63c60_0 conda-forge
mpi 1.0 mpich conda-forge
mpich 3.2.1 h1c2f66e_1007 conda-forge
conda and system (conda info):
$ conda info
active environment : mpi
active env location : /home/chris/miniconda3/envs/mpi
shell level : 1
user config file : /home/chris/.condarc
populated config files :
conda version : 4.5.11
conda-build version : not installed
python version : 3.6.3.final.0
base environment : /home/chris/miniconda3 (writable)
channel URLs : https://repo.anaconda.com/pkgs/main/linux-64
https://repo.anaconda.com/pkgs/main/noarch
https://repo.anaconda.com/pkgs/free/linux-64
https://repo.anaconda.com/pkgs/free/noarch
https://repo.anaconda.com/pkgs/r/linux-64
https://repo.anaconda.com/pkgs/r/noarch
https://repo.anaconda.com/pkgs/pro/linux-64
https://repo.anaconda.com/pkgs/pro/noarch
package cache : /home/chris/miniconda3/pkgs
/home/chris/.conda/pkgs
envs directories : /home/chris/miniconda3/envs
/home/chris/.conda/envs
platform : linux-64
user-agent : conda/4.5.11 requests/2.18.4 CPython/3.6.3 Linux/4.15.0-43-generic ubuntu/16.04 glibc/2.23
UID:GID : 1000:1000
netrc file : None
offline mode : False
I tried MPICH both from PyPI (mpi4py-mpich) and from conda-forge. MPICH from conda-forge is noticeably slower than MPICH from PyPI. As far as I know, MPICH supports XPMEM for 1-copy interprocess communication, which should be enabled by default. Might it be the case that MPICH has a build issue in conda-forge similar to the one OpenMPI has regarding CMA in conda-forge/openmpi-feedstock#118?
cc @dalcinl
_libgcc_mutex 0.1 main
_openmp_mutex 5.1 1_gnu
ca-certificates 2022.12.7 ha878542_0 conda-forge
certifi 2022.12.7 pyhd8ed1ab_0 conda-forge
ld_impl_linux-64 2.38 h1181459_1
libffi 3.4.2 h6a678d5_6
libgcc-ng 11.2.0 h1234567_1
libgfortran-ng 7.5.0 h14aa051_20 conda-forge
libgfortran4 7.5.0 h14aa051_20 conda-forge
libgomp 11.2.0 h1234567_1
libstdcxx-ng 11.2.0 h1234567_1
mpi 1.0 mpich conda-forge
mpi4py 3.1.4 py38hfc96bbd_0
mpich 3.3.2 hc856adb_0
ncurses 6.4 h6a678d5_0
openssl 1.1.1t h7f8727e_0
pip 23.0.1 py38h06a4308_0
python 3.8.16 h7a1cb2a_3
readline 8.2 h5eee18b_0
setuptools 65.6.3 py38h06a4308_0
sqlite 3.41.2 h5eee18b_0
tk 8.6.12 h1ccaba5_0
wheel 0.38.4 py38h06a4308_0
xz 5.2.10 h5eee18b_1
zlib 1.2.13 h5eee18b_0
active environment : mpich
active env location : $CONDA_PATH/envs/mpich
shell level : 1
user config file : $HOME_PATH/.condarc
populated config files :
conda version : 4.12.0
conda-build version : not installed
python version : 3.8.13.final.0
virtual packages : __cuda=12.0=0
__linux=5.4.0=0
__glibc=2.31=0
__unix=0=0
__archspec=1=x86_64
base environment : $CONDA_PATH (writable)
conda av data dir : $CONDA_PATH/etc/conda
conda av metadata url : None
channel URLs : https://repo.anaconda.com/pkgs/main/linux-64
https://repo.anaconda.com/pkgs/main/noarch
https://repo.anaconda.com/pkgs/r/linux-64
https://repo.anaconda.com/pkgs/r/noarch
package cache : $CONDA_PATH/pkgs
$HOME_PATH/.conda/pkgs
envs directories : $CONDA_PATH/envs
$HOME_PATH/.conda/envs
platform : linux-64
user-agent : conda/4.12.0 requests/2.27.1 CPython/3.8.13 Linux/5.4.0-136-generic ubuntu/20.04.4 glibc/2.31
UID:GID : 11815093:22002
netrc file : None
offline mode : False
Issue:
One of my recipes started failing when building with MPICH on OSX. The previous build worked out fine, so something probably changed along the way and started resulting in errors.
This is the most recent build: https://dev.azure.com/conda-forge/feedstock-builds/_build/results?buildId=270799&view=logs&j=b4588902-138a-5967-ecc7-b3fc381bfda2&t=5a7a20e7-b634-5369-ebb8-6b51f51eb32a&l=1152
The previous one that worked is: https://dev.azure.com/conda-forge/feedstock-builds/_build/results?buildId=230160&view=results
I tracked it down to the new mpich == 3.4. Downgrading to 3.3.2 seems to solve the problem and produce a good build.
Issue:
conda and system (conda info):
$ conda info
active environment : test_conda
active env location : /home/wddawson/miniconda3/envs/test_conda
shell level : 2
user config file : /home/wddawson/.condarc
populated config files :
conda version : 4.10.3
conda-build version : not installed
python version : 3.7.6.final.0
virtual packages : __linux=4.4.0=0
__glibc=2.31=0
__unix=0=0
__archspec=1=x86_64
base environment : /home/wddawson/miniconda3 (writable)
conda av data dir : /home/wddawson/miniconda3/etc/conda
conda av metadata url : None
channel URLs : https://repo.anaconda.com/pkgs/main/linux-64
https://repo.anaconda.com/pkgs/main/noarch
https://repo.anaconda.com/pkgs/r/linux-64
https://repo.anaconda.com/pkgs/r/noarch
package cache : /home/wddawson/miniconda3/pkgs
/home/wddawson/.conda/pkgs
envs directories : /home/wddawson/miniconda3/envs
/home/wddawson/.conda/envs
platform : linux-64
user-agent : conda/4.10.3 requests/2.26.0 CPython/3.7.6 Linux/4.4.0-19041-Microsoft ubuntu/20.04.3 glibc/2.31
UID:GID : 1000:1000
netrc file : None
offline mode : False
An environment can be set up like this (environment.yml):
name: test_conda
channels:
- conda-forge
dependencies:
- compilers
- mpich
conda env create -f environment.yml
conda activate test_conda
I created a Fortran subroutine that can be called from C++ (flib.f90):
MODULE FLIB
CONTAINS
SUBROUTINE MyRoutine() BIND(C, name="MyRoutine")
USE MPI
IMPLICIT NONE
INTEGER :: foo, err
foo = 1
WRITE(*,*) ">>", foo
CALL MPI_Allreduce(MPI_IN_PLACE, foo, 1, &
& MPI_INT, MPI_SUM, MPI_COMM_WORLD, err)
WRITE(*,*) "<<", foo
END SUBROUTINE MyRoutine
END MODULE
It can be driven from Fortran (fdriv.f90):
PROGRAM FDriv
USE FLIB
IMPLICIT NONE
INTEGER :: err
CALL MPI_Init(err)
CALL MyRoutine()
CALL MPI_Finalize(err)
END PROGRAM
And driven from C++ (cdriv.cc):
#include <mpi.h>
#include <iostream>
extern "C" {
void MyRoutine();
}
void CVersion() {
int foo;
foo = 1;
MPI_Allreduce(MPI_IN_PLACE, &foo, 1, MPI_INT, MPI_SUM, MPI_COMM_WORLD);
std::cout << " C >> " << foo << std::endl;
}
int main(int argc, char *argv[]) {
MPI_Init(&argc, &argv);
MyRoutine();
CVersion();
MPI_Finalize();
return 0;
}
Here is the Makefile:
ALL: fort cxx
flib.o: flib.f90
mpif90 -c $< -o $@
fdriv.o: fdriv.f90 flib.o
mpif90 -c $< -o $@
fort: fdriv.o flib.o
mpif90 $^ -o $@
cdriv.o: cdriv.cc
mpicxx -c $< -o $@ -lstdc++
cxx: cdriv.o flib.o
mpif90 $^ -o $@
test:
mpirun -np 1 ./fort
mpirun -np 1 ./cxx
clean:
rm *.o *.mod fort cxx
When I run the code I get the following output:
mpirun -np 1 ./fort
>> 1
<< 1
mpirun -np 1 ./cxx
>> 1
<< 0
C >> 1
The 0 result is incorrect; it should be 1.
Note that if I switch mpich for openmpi, the code operates correctly. The code also operates correctly if I use the MPICH which is installed via apt.
As we do one-time builds, it may be good to use ./configure --disable-dependency-tracking ...
https://www.gnu.org/software/automake/manual/html_node/Dependency-Tracking.html
I am trying to build a package which uses the mpic++ command to compile:
conda-forge/n2p2-feedstock#15
For Python 3.10 everything works fine, but for Python < 3.10 the builds fail. The reason is that rather than installing mpich=4.0.2=h846660c_100, the package mpich=4.0.2=external_0 is installed, which does not provide the corresponding executable.
While working on conda-forge/mumps-feedstock#102, I got as far as running the tests, but they start to crash with:
Abort(123809283) on node 1 (rank 1 in comm 0): Fatal error in internal_Bcast: Invalid datatype, error stack:
internal_Bcast(107): MPI_Bcast(buffer=0x16f87ab74, count=1, MPI_DATATYPE_NULL, 0, comm=0x84000001) failed
internal_Bcast(67).: Datatype for argument datatype is a null datatype
I looked through mumps source for MPI_BCAST calls, and found some with MPI_LOGICAL. I then checked mpif.h, which has:
PARAMETER (MPI_LOGICAL=MPI_DATATYPE_NULL)
suggesting that MPI_LOGICAL is unavailable. That doesn't seem right, but I don't know enough about mpich for how it determines when these should be available. The same is true for linux-aarch64, which is also cross-compiled.
The osx-64 package has:
PARAMETER (MPI_LOGICAL=1275069469)
So this suggests something is amiss in the cross-compiled mpich builds.
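A quick diagnostic sketch for checking this in an installed package (assumes an activated env containing mpich; the expected values are taken from the observations above):

```shell
# Inspect the Fortran datatype handles baked into mpif.h:
grep "MPI_LOGICAL" "$CONDA_PREFIX/include/mpif.h"

# A healthy build defines a real handle, e.g.
#   PARAMETER (MPI_LOGICAL=1275069469)
# whereas the broken cross-compiled builds show
#   PARAMETER (MPI_LOGICAL=MPI_DATATYPE_NULL)
```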
# packages in environment at /Users/minrk/conda/conda-bld/mumps_1702982356640/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_plac:
#
# Name Version Build Channel
bzip2 1.0.8 h93a5062_5 conda-forge
ca-certificates 2023.11.17 hf0a4a13_0 conda-forge
cctools_osx-arm64 973.0.1 h62378fb_15 conda-forge
clang 16.0.6 haab561b_3 conda-forge
clang-16 16.0.6 default_hd209bcb_3 conda-forge
clang_impl_osx-arm64 16.0.6 hc421ffc_7 conda-forge
clang_osx-arm64 16.0.6 h54d7cd3_7 conda-forge
clangxx 16.0.6 default_h5c94ee4_3 conda-forge
compiler-rt 16.0.6 h3808999_2 conda-forge
compiler-rt_osx-arm64 16.0.6 h3808999_2 conda-forge
gfortran_impl_osx-arm64 12.3.0 hbbb9e1e_1 conda-forge
gfortran_osx-arm64 12.3.0 h57527a5_1 conda-forge
gmp 6.3.0 h965bd2d_0 conda-forge
icu 73.2 hc8870d7_0 conda-forge
isl 0.25 h9a09cb3_0 conda-forge
ld64_osx-arm64 609 ha4bd21c_15 conda-forge
libblas 3.9.0 20_osxarm64_openblas conda-forge
libclang-cpp16 16.0.6 default_hd209bcb_3 conda-forge
libcxx 16.0.6 h4653b0c_0 conda-forge
libgfortran 5.0.0 13_2_0_hd922786_1 conda-forge
libgfortran-devel_osx-arm64 12.3.0 hc62be1c_1 conda-forge
libgfortran5 13.2.0 hf226fd6_1 conda-forge
libiconv 1.17 h0d3ecfb_2 conda-forge
liblapack 3.9.0 20_osxarm64_openblas conda-forge
libllvm16 16.0.6 haab561b_3 conda-forge
libopenblas 0.3.25 openmp_h6c19121_0 conda-forge
libptscotch 7.0.4 h5340af2_1 conda-forge
libscotch 7.0.4 hc938e73_1 conda-forge
libxml2 2.12.3 h0d0cfa8_0 conda-forge
libzlib 1.2.13 h53f4e23_5 conda-forge
llvm-openmp 17.0.6 hcd81f8e_0 conda-forge
llvm-tools 16.0.6 haab561b_3 conda-forge
metis 5.1.1 h965bd2d_2 conda-forge
mpc 1.3.1 h91ba8db_0 conda-forge
mpfr 4.2.1 h9546428_0 conda-forge
mpi 1.0 mpich conda-forge
mpich 4.1.2 hd4b5bf3_100 conda-forge
mumps-include 5.6.2 hce30654_1 local
mumps-mpi 5.6.2 h614c46f_1 local
openssl 3.2.0 h0d3ecfb_1 conda-forge
parmetis 4.0.3 hefa2a9d_1005 conda-forge
ptscotch 7.0.4 heaa5b5c_1 conda-forge
scalapack 2.2.0 hb170938_1 conda-forge
scotch 7.0.4 heaa5b5c_1 conda-forge
sigtool 0.1.3 h44b9a77_0 conda-forge
tapi 1100.0.11 he4954df_0 conda-forge
xz 5.2.6 h57fd34a_0 conda-forge
zlib 1.2.13 h53f4e23_5 conda-forge
zstd 1.5.5 h4f39d0f_0 conda-forge
mamba version : 1.5.4
active environment : None
shell level : 0
user config file : /Users/minrk/.condarc
populated config files : /Users/minrk/conda/.condarc
/Users/minrk/.condarc
conda version : 23.11.0
conda-build version : 3.28.1
python version : 3.10.13.final.0
solver : libmamba (default)
virtual packages : __archspec=1=m1
__conda=23.11.0=0
__osx=14.1.2=0
__unix=0=0
base environment : /Users/minrk/conda (writable)
conda av data dir : /Users/minrk/conda/etc/conda
conda av metadata url : None
channel URLs : https://conda.anaconda.org/conda-forge/osx-arm64
https://conda.anaconda.org/conda-forge/noarch
package cache : /Users/minrk/conda/pkgs
/Users/minrk/.conda/pkgs
envs directories : /Users/minrk/conda/envs
/Users/minrk/.conda/envs
platform : osx-arm64
user-agent : conda/23.11.0 requests/2.31.0 CPython/3.10.13 Darwin/23.1.0 OSX/14.1.2 solver/libmamba conda-libmamba-solver/23.11.1 libmambapy/1.5.4
UID:GID : 501:20
netrc file : /Users/minrk/.netrc
offline mode : False
Similar to openmpi, the mamba solver is pulling in the external variants first. We should deploy the same fix here, namely making sure the external packages have the same deps as the real ones to help the solver.
I have also made a virtual package (https://github.com/regro/conda-forge-conda-plugins) that exposes the mpich version, so that we can try that out.
Issue:
We are seeing the mpich build fail on netcdf4-feedstock with recent changes: conda-forge/netcdf4-feedstock#112. It appears that the host environment picks up the external build of the mpich package instead of the conda version by default.
I have found that I can get the build to work if I add mpich-mpicc as a dependency as well. Before I do this, I just wanted to make sure this is the suggested solution. It might be helpful to add some documentation for how packages with mpich as a dependency should handle this. As far as I can tell, the existing documentation only tells users how to explicitly get the external version, not how to explicitly request the conda version: https://conda-forge.org/docs/user/tipsandtricks.html#using-external-message-passing-interface-mpi-libraries. If there is already documentation on how to do this, I apologize for not finding it.
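For reference, a hedged sketch of the workaround described above, as it might appear in a downstream recipe's meta.yaml (the section placement and comments are assumptions, not taken from this feedstock):

```yaml
requirements:
  host:
    - mpich        # alone, the solver may resolve to the external_* stub build
    - mpich-mpicc  # also requesting the conda-built compiler wrapper subpackage
```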
Hello, are there any plans to release the newest mpich 4.2.0 ?
Thanks.
Please remove recipe/tests/hellow_{c|f|f90} from the repository. I guess these binary files were committed by accident.
Issue:
I have a recipe that depends on mpich and also uses compiler('fortran'). On macOS, the Fortran compiler seems to be pinned at version 4, while mpich requires version 3:
conda.exceptions.UnsatisfiableError: The following specifications were found to be in conflict:
- libgfortran[version='>=4.0.0,<5.0.0.a0']
- mpich=3.2 -> libgfortran[version='>=3.0.1,<4.0.0.a0']
Hopefully just a rerender is needed.
@conda-forge-admin, please rerender
Environment (conda list):
$ conda list
N/A
conda and system (conda info):
$ conda info
active environment : base
active env location : /Users/travis/miniconda
shell level : 1
user config file : /Users/travis/.condarc
populated config files : /Users/travis/.condarc
conda version : 4.6.3
conda-build version : 3.17.8
python version : 3.7.1.final.0
base environment : /Users/travis/miniconda (writable)
channel URLs : https://repo.anaconda.com/pkgs/main/osx-64
https://repo.anaconda.com/pkgs/main/noarch
https://repo.anaconda.com/pkgs/free/osx-64
https://repo.anaconda.com/pkgs/free/noarch
https://repo.anaconda.com/pkgs/r/osx-64
https://repo.anaconda.com/pkgs/r/noarch
package cache : /Users/travis/miniconda/pkgs
/Users/travis/.conda/pkgs
envs directories : /Users/travis/miniconda/envs
/Users/travis/.conda/envs
platform : osx-64
user-agent : conda/4.6.3 requests/2.21.0 CPython/3.7.1 Darwin/17.4.0 OSX/10.13.3
UID:GID : 501:20
netrc file : None
offline mode : False
@sisivy I believe your fix is unfortunately incorrect. Not your fault, though: the output of ./configure --help is wrong; look what's in configure.ac:
...
AC_ARG_WITH([wrapper-dl-type],
[AS_HELP_STRING([--enable-wrapper-dl-type],
[Dynamic loading model for alternate MPI
...
LD_LIBRARY_PATH)])],
[],[with_wrapper_dl_type=runpath])
AC_SUBST([with_wrapper_dl_type])
...
Therefore, the proper option and value to pass to ./configure should be --with-wrapper-dl-type=none.
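Concretely, a hedged sketch of the corrected configure invocation (only the dl-type flag comes from configure.ac above; the prefix option is a placeholder following conda-build conventions):

```shell
./configure --with-wrapper-dl-type=none --prefix=$PREFIX
```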
Do not submit a new PR just yet; I'll try to sneak a fix into some upcoming cleanup I'm planning.
PS: Issue submitted upstream: pmodels/mpich#6889
Originally posted by @dalcinl in #87 (comment)
With mpich 4.2.2, when used in a new conda environment (including on GitHub Actions), an undefined reference to GLIBC_2.14 is reported, even when a more recent version of glibc is present.
I don't know if there is a recommended way to ensure glibc matches in Conda, but this has never been an issue with previous versions (including mpich 4.2.1).
Steps to reproduce:
conda create -n testmpi mpich gcc_linux-64
export PYTHONNOUSERSITE=1
. activate testmpi
mpicc helloworld_mpi.c
... libmpi.so: undefined reference to `memcpy@GLIBC_2.14'
where helloworld_mpi.c is:
#include <stdio.h>
#include <mpi.h>

int main(int argc, char **argv)
{
    int rank;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    printf("Hello World from Rank %d\n", rank);
    MPI_Finalize();
}
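To see where the GLIBC_2.14 requirement comes from, one can inspect the versioned glibc symbols that libmpi.so links against. A hedged sketch (the library path is an assumption based on a standard conda environment layout; adjust it for your setup):

```python
import os
import re
import subprocess

# Assumed location of the conda-installed libmpi.so; adjust as needed.
lib = os.path.join(os.environ.get("CONDA_PREFIX", "/opt/conda"), "lib", "libmpi.so")

if os.path.exists(lib):
    # Dump the dynamic symbol table and collect the GLIBC version tags.
    out = subprocess.run(["objdump", "-T", lib],
                         capture_output=True, text=True).stdout
    print(sorted(set(re.findall(r"GLIBC_[0-9.]+", out))))
else:
    print(f"libmpi.so not found at {lib}; adjust the path for your env")
```

If GLIBC_2.14 shows up here while the build system targets an older sysroot (sysroot_linux-64 2.12 in the environment above), that mismatch would explain the link error.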
Note that I have the conda-forge channel configured:
conda config --show channels
channels:
- conda-forge
- defaults
# packages in environment at /home/shudson/miniconda3/envs/testmpi:
#
# Name Version Build Channel
_libgcc_mutex 0.1 conda_forge conda-forge
_openmp_mutex 4.5 2_gnu conda-forge
binutils_impl_linux-64 2.38 h2a08ee3_1
binutils_linux-64 2.38.0 hc2dff05_0
gcc_impl_linux-64 11.2.0 h1234567_1
gcc_linux-64 11.2.0 h5c386dc_0
kernel-headers_linux-64 2.6.32 he073ed8_17 conda-forge
ld_impl_linux-64 2.38 h1181459_1
libgcc-devel_linux-64 11.2.0 h1234567_1
libgcc-ng 14.1.0 h77fa898_0 conda-forge
libgfortran-ng 14.1.0 h69a702a_0 conda-forge
libgfortran5 14.1.0 hc5f4f2c_0 conda-forge
libgomp 14.1.0 h77fa898_0 conda-forge
libstdcxx-ng 14.1.0 hc0a3c3a_0 conda-forge
mpi 1.0 mpich conda-forge
mpich 4.2.2 h4a7f18d_100 conda-forge
sysroot_linux-64 2.12 he073ed8_17 conda-forge
conda info
active environment : testmpi
active env location : /home/shudson/miniconda3/envs/testmpi
shell level : 1
user config file : /home/shudson/.condarc
populated config files : /home/shudson/.condarc
conda version : 4.10.3
conda-build version : not installed
python version : 3.9.5.final.0
virtual packages : __linux=5.15.0=0
__glibc=2.31=0
__unix=0=0
__archspec=1=x86_64
base environment : /home/shudson/miniconda3 (writable)
conda av data dir : /home/shudson/miniconda3/etc/conda
conda av metadata url : None
channel URLs : https://conda.anaconda.org/conda-forge/linux-64
https://conda.anaconda.org/conda-forge/noarch
https://repo.anaconda.com/pkgs/main/linux-64
https://repo.anaconda.com/pkgs/main/noarch
https://repo.anaconda.com/pkgs/r/linux-64
https://repo.anaconda.com/pkgs/r/noarch
package cache : /home/shudson/miniconda3/pkgs
/home/shudson/.conda/pkgs
envs directories : /home/shudson/miniconda3/envs
/home/shudson/.conda/envs
platform : linux-64
user-agent : conda/4.10.3 requests/2.28.2 CPython/3.9.5 Linux/5.15.0-113-generic ubuntu/20.04.6 glibc/2.31
UID:GID : 1000:1000
netrc file : /home/shudson/.netrc
offline mode : False
Hello!
I'm trying to use shared memory with spawned processes, but I'm having problems with mpich: the child processes' shared memory differs from the master process's memory.
I have prepared the simple reproducer.
Please save it as mpich_problem.py and run it like this:
mpiexec -n 1 python mpich_problem.py
It works with mpi4py on openmpi=4.1.5 but does not work with mpi4py on mpich:
import sys
import pickle
from mpi4py import MPI

########################
# Threads initializing #
########################
comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()
parent_comm = MPI.Comm.Get_parent()

if rank == 0 and parent_comm == MPI.COMM_NULL and size == 1:
    nprocs_to_spawn = 3
    args = ["mpich_problem.py"]
    info = MPI.Info.Create()
    intercomm = MPI.COMM_SELF.Spawn(
        sys.executable,
        args,
        maxprocs=nprocs_to_spawn,
        info=info,
        root=rank,
    )
    comm = intercomm.Merge(high=False)

if parent_comm != MPI.COMM_NULL:
    comm = parent_comm.Merge(high=True)

rank = comm.Get_rank()
size = comm.Get_size()

##############################
# Shared memory initializing #
##############################
# I am going to use non-contiguous shared memory, but it's not necessary.
# The problem is the same as in the case of using contiguous shared memory.
info = MPI.Info.Create()
info.Set("alloc_shared_noncontig", "true")
win = MPI.Win.Allocate_shared(5000, MPI.BYTE.size, comm=comm, info=info)
buf, itemsize = win.Shared_query(0)

##########################
# Threads communications #
##########################
# Rank #0 creates the data, serializes it, and then puts it into shared memory.
if rank == 0:
    data = list(range(1000))
    s_data = pickle.dumps(data)
    buf[: len(s_data)] = s_data
    for i in range(1, 4):
        comm.send(len(s_data), dest=i)
        comm.Send(s_data, dest=i)
# The other ranks get the length of the serialized data and try to deserialize it.
else:
    size = comm.recv(source=0)
    expected_buf = bytearray(size)
    comm.Recv(expected_buf, source=0)
    assert buf[:size] == expected_buf, "Shared buffer is not equal to expected buffer"
    obj = pickle.loads(buf[:size])
    print(f"{rank}: {len(obj)}")

##############
# Finish MPI #
##############
if not MPI.Is_finalized():
    MPI.Finalize()
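In the failing case, the bytes read back from the shared window differ from the bytes rank 0 wrote. Stripped of MPI, the round-trip the reproducer expects looks like this (a minimal sketch; a plain bytearray stands in for the shared-memory window, with the same 5000-byte size):

```python
import pickle

# A fixed-size buffer standing in for the MPI.Win.Allocate_shared window.
buf = bytearray(5000)
data = list(range(1000))

# What rank 0 writes into shared memory.
s_data = pickle.dumps(data)
buf[: len(s_data)] = s_data

# What the other ranks expect to read back out of the window.
obj = pickle.loads(bytes(buf[: len(s_data)]))
assert obj == data
print(len(obj))  # → 1000
```

With mpich, the assertion against the shared buffer fails because the window contents seen by the spawned ranks do not match this expected serialization.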
Could you fix this problem?
# packages in environment at $USER_PATH/miniconda3/envs/mpi4py_mpich:
#
# Name Version Build Channel
_libgcc_mutex 0.1 conda_forge conda-forge
_openmp_mutex 4.5 2_gnu conda-forge
bzip2 1.0.8 h7f98852_4 conda-forge
ca-certificates 2022.12.7 ha878542_0 conda-forge
ld_impl_linux-64 2.40 h41732ed_0 conda-forge
libffi 3.4.2 h7f98852_5 conda-forge
libgcc-ng 12.2.0 h65d4601_19 conda-forge
libgfortran-ng 12.2.0 h69a702a_19 conda-forge
libgfortran5 12.2.0 h337968e_19 conda-forge
libgomp 12.2.0 h65d4601_19 conda-forge
libnsl 2.0.0 h7f98852_0 conda-forge
libsqlite 3.40.0 h753d276_0 conda-forge
libstdcxx-ng 12.2.0 h46fd767_19 conda-forge
libuuid 2.38.1 h0b41bf4_0 conda-forge
libzlib 1.2.13 h166bdaf_4 conda-forge
mpi 1.0 mpich conda-forge
mpi4py 3.1.4 py39h32b9844_0 conda-forge
mpich 4.0.3 h846660c_100 conda-forge
ncurses 6.3 h27087fc_1 conda-forge
openssl 3.1.0 h0b41bf4_0 conda-forge
pip 23.0.1 pyhd8ed1ab_0 conda-forge
python 3.9.16 h2782a2a_0_cpython conda-forge
python_abi 3.9 3_cp39 conda-forge
readline 8.2 h8228510_1 conda-forge
setuptools 67.6.1 pyhd8ed1ab_0 conda-forge
tk 8.6.12 h27826a3_0 conda-forge
tzdata 2023c h71feb2d_0 conda-forge
wheel 0.40.0 pyhd8ed1ab_0 conda-forge
xz 5.2.6 h166bdaf_0 conda-forge
active environment : mpi4py_mpich
active env location : $USER_PATH/miniconda3/envs/mpi4py_mpich
shell level : 2
user config file : $USER_PATH/.condarc
populated config files :
conda version : 23.1.0
conda-build version : not installed
python version : 3.9.16.final.0
virtual packages : __archspec=1=x86_64
__glibc=2.35=0
__linux=5.15.0=0
__unix=0=0
base environment : $USER_PATH/miniconda3 (writable)
conda av data dir : $USER_PATH/miniconda3/etc/conda
conda av metadata url : None
channel URLs : https://repo.anaconda.com/pkgs/main/linux-64
https://repo.anaconda.com/pkgs/main/noarch
https://repo.anaconda.com/pkgs/r/linux-64
https://repo.anaconda.com/pkgs/r/noarch
package cache : $USER_PATH/miniconda3/pkgs
$USER_PATH/.conda/pkgs
envs directories : $USER_PATH/miniconda3/envs
$USER_PATH/.conda/envs
platform : linux-64
user-agent : conda/23.1.0 requests/2.28.1 CPython/3.9.16 Linux/5.15.0-67-generic ubuntu/22.04.1 glibc/2.35
UID:GID : 1002:1002
netrc file : None
offline mode : False