metwork-framework / resources

common resources for metwork-framework organization
Home Page: http://www.metwork-framework.org
License: BSD 3-Clause "New" or "Revised" License
For the moment, we only build the master branch (and the integration branch, which is a kind of "pre-master"). We also build PRs before acceptance, which is great.

Now we need to prepare our build system for release branches:

- `guess_version.sh` script for `release_*` branches
- `.drone.yml` files with cookiecutter in the resources repository
- `.drone.yml` files

Then:

- https://medium.com/mergify/announcing-the-new-mergify-engine-7bb2eceeb778 (probably breaks existing branch protection)

Probably with a sum of processes, filtered like it is done in `kill_remaining_processes.py`, or something like that.
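As a starting point, the `guess_version.sh` script mentioned above could derive a version string from the branch name. This is only a hedged sketch: the `release_X.Y` naming convention and the fallback values are assumptions, not the real script.

```shell
#!/bin/sh
# Hypothetical sketch of guess_version.sh: map a git branch name to a
# version string (the release_X.Y convention is an assumption).
guess_version() {
    branch="$1"
    case "$branch" in
        release_*) echo "${branch#release_}" ;;   # release_1.2 => 1.2
        master|integration) echo "master" ;;      # development builds
        *) echo "unknown" ;;
    esac
}

guess_version "release_1.2"   # prints: 1.2
```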
Depending on the integration level of each repo, various configurations should be forced:

`genhdlist2`
When this test is run, nothing is explicitly done to load the tested layer. So the test can fail on a library which is provided by the layer itself or by one of its dependencies. This usually does not happen, because many layers are optionally loaded as dependencies of default@mfext (which is loaded when the test is run). But we don't want to load every layer from every available addon by default. So it would be a good thing to change the test to make sure that the tested layer is loaded, and to avoid false alarms on extra dependencies.

For example:

- we should also execute integration tests on the corresponding PR
- warning: not an easy task (with drone)
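The false-alarm scenario above can be illustrated with a small sketch. The assumption here is that the test is `ldd`-based: once the tested layer (and hence its dependencies) is explicitly loaded, only genuinely unresolved libraries should remain "not found". The parsing helper below is hypothetical, not the real test code.

```python
# Sketch: extract the shared libraries that ldd reports as "not found".
# If the tested layer is loaded first, libraries provided by the layer
# itself or by its dependencies no longer raise a false alarm.

def missing_libs(ldd_output):
    """Return the library names reported as 'not found' by ldd."""
    missing = []
    for line in ldd_output.splitlines():
        if "not found" in line:
            # the library name is the first token of the ldd line
            missing.append(line.split()[0])
    return missing


example = "\tlibz.so.1 => /lib64/libz.so.1\n\tlibgrib.so => not found"
print(missing_libs(example))  # → ['libgrib.so']
```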
New feature (see the CONTRIBUTING.md guide), ping @thebaptiste.

So renaming a plugin is difficult and error-prone:

- this is also blocking the "plugins.swap" feature on mfserv

I think we want to keep the `.layerapi2_label` file content as a reference. So we have to:
`opinionated_configparser`:

- `opinionated_configparser` (use `envtpl`)
- `opinionated_configparser` in mfext (python3)
- `opinionated_configparser` in mfext (python2)
- `__ini_to_env` using it
- `opinionated_configparser`

In `mfext`:

- `config.py`
- `layers/layer9_python3_misc/0100_mfutil/mfutil/plugins.py`
- `use_envtpl`

`mfadmin`, `mfsysmon`

In `mfbase`:

- `adm/__make_nginx_conf`
- `use_envtpl`

In `mfserv`:

- `adm/__make_nginx_conf`
- `adm/_plugins.is_dangerous`
- `adm/_make_circus_conf`
- `use_envtpl`

In `mfdata`:

- `plugins/switch/main.py`
- `layers/layer1_python3/0300_acquisition/acquisition/configargparse_confparser.py`
- `layers/layer1_python3/0100_directory_observer/requirements.txt`
- `layers/layer1_python3/0100_directory_observer/directory_observer/directory_observer.py`
- `adm/_make_directory_observer_conf`
- `adm/_make_switch_conf`
- `adm/_make_circus_conf`
- `adm/before_start_directory_observer`
- `use_envtpl`

And:

- `__ini_to_env`
- `configparser_extended` from the private repository
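As a reminder of what an `__ini_to_env`-style helper has to do, here is a stdlib-only sketch: flatten an INI file into `SECTION_KEY=value` environment assignments. The naming scheme is an assumption; the real metwork helper may differ.

```python
import configparser

# Hypothetical sketch of an __ini_to_env-style helper: turn each
# [section] key=value pair into a SECTION_KEY environment variable
# (the upper-casing convention is an assumption).

def ini_to_env(ini_text):
    parser = configparser.ConfigParser()
    parser.read_string(ini_text)
    env = {}
    for section in parser.sections():
        for key, value in parser.items(section):
            env["%s_%s" % (section.upper(), key.upper())] = value
    return env

print(ini_to_env("[nginx]\nport=8080\nworkers=4"))
# → {'NGINX_PORT': '8080', 'NGINX_WORKERS': '4'}
```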
```
/etc/rc.d/init.d/metwork start
[...]
System: disabling transparent_hugepage... /etc/rc.d/init.d/metwork: line 128: /sys/kernel/mm/transparent_hugepage/enabled: Read-only file system
ERROR
```
To take an example with mfserv:

- in `circus`, the code is run inside the plugin_env but we add the plugin_dir to PYTHONPATH (we can change this, but this is the default configuration); in some cases (not configured by default), we can also add the app_dir to PYTHONPATH
- in `plugin_wrapper` (cron, continuous integration...), we don't add plugin_dir or app_dir to PYTHONPATH
- in `plugin_env`, we don't add plugin_dir or app_dir to PYTHONPATH

We have to define a unique way to do all these things. The main entry point should be `plugin_wrapper`. We have to rewrite it to add some options. Then, `plugin_env` and `_make_circus_conf` should use it instead of writing specific code:

- rewrite `plugin_wrapper` with an option (true by default) to add the current plugin dir to PYTHONPATH, and an option (false by default) to add an app dir to PYTHONPATH
- change `_make_circus_conf` in `mfdata` and `mfserv` to use this new `plugin_wrapper` instead of custom code
- change `plugin_env` in `mfext` to use this new `plugin_wrapper` instead of custom code

And last:

- change the `layerapi2` behaviour in `mfext` to add `lib` to PYTHONPATH by default
- PYTHONPATH by default in `plugin_wrapper`
- maybe we can introduce some helpers and warnings to avoid this
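The option handling described above could look like the following sketch. Option names, defaults and variable names are all assumptions, not the actual `plugin_wrapper` interface.

```shell
#!/bin/sh
# Hypothetical sketch of a unified plugin_wrapper (names are assumptions):
#   --no-plugin-dir-in-pythonpath  disable the default-on plugin_dir export
#   --app-dir-in-pythonpath        enable the default-off app_dir export

PLUGIN_DIR="${PLUGIN_DIR:-/tmp/demo_plugin}"
APP_DIR="${APP_DIR:-/tmp/demo_plugin/app}"
ADD_PLUGIN_DIR=1   # true by default
ADD_APP_DIR=0      # false by default

while [ "$#" -gt 0 ]; do
    case "$1" in
        --no-plugin-dir-in-pythonpath) ADD_PLUGIN_DIR=0; shift ;;
        --app-dir-in-pythonpath) ADD_APP_DIR=1; shift ;;
        *) break ;;   # remaining arguments: the wrapped command
    esac
done

if [ "$ADD_PLUGIN_DIR" = "1" ]; then
    PYTHONPATH="${PLUGIN_DIR}${PYTHONPATH:+:${PYTHONPATH}}"
fi
if [ "$ADD_APP_DIR" = "1" ]; then
    PYTHONPATH="${APP_DIR}${PYTHONPATH:+:${PYTHONPATH}}"
fi
export PYTHONPATH

echo "PYTHONPATH=${PYTHONPATH}"
# then: exec "$@" to run the wrapped command in the adjusted environment
```

With this single entry point, `plugin_env` and `_make_circus_conf` only differ in the options they pass, not in the PYTHONPATH logic itself.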
First ideas:
Thanks to @thebaptiste's last change about layer packaging names, we can now think about better "metwork metapackage names".

Now:

- when we `yum install metwork-mfext`, we install (nearly) all mfext layers and the symbolic link `/opt/metwork-mfext => /opt/metwork-mfext-trunk` (not really a big deal)
- when we `yum install metwork-mfsysmon` (because of the metwork-mfsysmon => metwork-mfext dependency), we install (nearly) all mfext layers (for this module, we only need a few things)

I think we can change this now to have (for example):

- `metwork-mfext-minimal-{BRANCH}`: install mfext minimal layers (root, core, default, python3_core, python3, python, default)
- `metwork-mfext-minimal`: install mfext minimal layers + symbolic link `/opt/metwork-mfext => /opt/metwork-mfext-{BRANCH}`
- `metwork-mfext-full-{BRANCH}`: install all mfext layers
- `metwork-mfext-full`: install all mfext layers + symbolic link `/opt/metwork-mfext => /opt/metwork-mfext-{BRANCH}`
- `metwork-mfext`: alias of `metwork-mfext-full`
- `metwork-mfext-scientific-{BRANCH}` and `metwork-mfext-scientific` (install scientific, python3_scientific)
- `metwork-mfext-devtools-{BRANCH}` and `metwork-mfext-devtools` (install python3_devtools, devtools, python3_devtools_jupyter, python3_scientific, scientific)

And keep only these meta-packages. Of course, we keep all newly created packages: `metwork-mfext-layer-*`.

For other modules, we have to describe the root dependencies better. For example, the `metwork-mfsysmon` package has a `.layerapi2_dependencies` with only `root@mfcom`. It should be:

And it should be automatically read in the `_metwork.spec` file.

So when we would do `yum install metwork-mfsysmon`, it would install:

and (because of dependencies):

`metwork-mfext-layer-default` and `metwork-mfext-layer-python` are missing in this idea. Maybe we have to change something about them.
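Reading `.layerapi2_dependencies` automatically from the spec file could work along these lines. This is only a sketch: the one-`layer@module`-entry-per-line format, the `-` prefix for optional dependencies, and the generated package names are assumptions about the final convention.

```python
# Hedged sketch: derive RPM "Requires:" lines for a _metwork.spec file
# from a .layerapi2_dependencies file (format and naming are assumptions).

def deps_to_rpm_requires(dependencies_text):
    requires = []
    for line in dependencies_text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or line.startswith("-"):
            continue  # skip blank lines, comments and optional dependencies
        layer, _, module = line.partition("@")
        requires.append("Requires: metwork-%s-layer-%s" % (module, layer))
    return requires

print(deps_to_rpm_requires("root@mfcom\npython3@mfext"))
# → ['Requires: metwork-mfcom-layer-root', 'Requires: metwork-mfext-layer-python3']
```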
An idea would be:

- rename `default` into `default@mfext` (in mfext) and change its dependencies to limit them to mfext components
- add a `default@mfcom` layer with mfcom components (FIXME: what about the `python` dependency => same idea) and `default@mfext`
- rename `default` into `default@mfadmin` (in mfadmin) with mfadmin components and `default@mfcom`
- change `profile` to load `default@mfext`, `default@mfcom` (if relevant) and `default@${MODULE_LOWERCASE}` (if relevant) instead of `default`
- add a `default@${MODULE_LOWERCASE}` layer for all modules with ${MODULE_LOWERCASE} components and `default@mfcom` as dependencies

Do the same for the `python` layer (we want to drop it later, but not now):

- rename `python` into `python@mfext` (in mfext) and change its dependencies to limit them to mfext components
- add a `python@mfcom` layer with mfcom components and `python@mfext`
- add a `python@${MODULE_LOWERCASE}` layer in each module having python2 or python3 layers, with ${MODULE_LOWERCASE} python{METWORK_PYTHON_MODE} components and `python@mfcom` as dependencies
- `{METWORK_PYTHON_MODE}` during rpm dependencies check (with the `3` hardcoded value)

`.drone.yml` and maybe other files:
- `MODULE`
- `MODULE_HOME`
- `MODULE_LOWERCASE`
- `MODULE_RUNTIME_SUFFIX`
- `MODULE_RUNTIME_HOME`
- `MODULE_RUNTIME_USER`
- `MODULE_VERSION`

Maybe we should prefix them to get:

- `MFMODULE`
- `MFMODULE_HOME`
Because of metwork-framework/mfserv#281 and metwork-framework/mfext#571, we propose a big log refactoring for metwork.

Changes:

- `layer0_core` `config.ini`
- `config.ini` in the new templated system: `config.ini.custom`
- `log_proxy` global configuration for all modules
- replace `std_redirect` in all `circus` workers by the corresponding tool in `log_proxy`
- `mfext` templates
- mfserv
- mfdata
- mfbase
- mfsysmon
- mfadmin
- `log_proxy` usage in crontab for mfserv
- `log_proxy` usage in crontab for mfdata
- `log_proxy` usage in crontab for mfbase
- `log_proxy` usage in crontab for mfsysmon
- `log_proxy` usage in crontab for mfadmin
- `nginx`: https://stackoverflow.com/questions/22541333/have-nginx-access-log-and-error-log-log-to-stdout-and-stderr-of-master-process
- logrotate `copytruncate` feature
- https://github.com/metwork-framework/jsonsyslog2elasticsearch in mfext
- https://github.com/metwork-framework/jsonlog2elasticsearch from mfext
- https://github.com/metwork-framework/jsonlog2elasticsearch
- `mflog` to replace the `json_logs.log` feature, with syslog to `jsonsyslog2elasticsearch`
- `nodejs` lib (work started in metwork-framework/mfserv#309)

... since my commits with a very long comment
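The `json_logs.log` replacement above implies one-JSON-document-per-line records that a `jsonsyslog2elasticsearch`-style daemon can ship to Elasticsearch. The stdlib sketch below only illustrates that line format; the field names are assumptions, not the actual mflog output.

```python
import json
import time

# Illustrative sketch of a JSON log line as consumed by a
# jsonsyslog2elasticsearch-style shipper (field names are assumptions).

def json_log_line(level, message, **fields):
    record = {"timestamp": time.time(), "level": level, "message": message}
    record.update(fields)  # extra structured context, e.g. plugin name
    return json.dumps(record, sort_keys=True)

print(json_log_line("INFO", "worker started", plugin="switch"))
```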
The current implementation of `conf_monitor` is written in bash and is CPU consuming (because we build all configuration files at each iteration). We should rewrite it better, in python.
Between the "PR ready to merge" status, GitHub branch protections and the mergify bot.

Like the number of workers, for example.

If `MFMODULE != "MFEXT"`:

```
export LOGPROXY_LOG_DIRECTORY="${MFMODULE_RUNTIME_HOME}/log"
```
All these points can be corrected by evolutions of the `_metwork.spec` file from the mfext module. The first and third points should be corrected as follows: