Python 3.11
You can also use pyenv to manage your Python installations. Simply follow its instructions and set the global version to 3.11.
$ sudo apt-get install python3 build-essential python3-dev
$ brew install postgresql@14 libmagic openssl@3 openblas python
Please follow the instructions at https://github.com/nvm-sh/nvm#installing-and-updating
We're using v20.0.0 (the first version you install becomes nvm's default):
$ nvm install 20.0.0
$ nvm use 20.0.0
Please follow the instructions at https://classic.yarnpkg.com/en/docs/install/#debian-stable
$ brew install yarn
Install poetry
https://python-poetry.org/docs/
$ curl -sSL https://install.python-poetry.org | python3 -
Install pre-commit
https://pre-commit.com/
$ curl https://pre-commit.com/install-local.py | python -
And run
$ pre-commit install
Follow the guide https://docs.docker.com/compose/install/
Turn off the AirPlay Receiver under System Preferences -> Sharing -> AirPlay Receiver.
Otherwise, you will run into problems with port 5000 already being in use.
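If you want to check which process is currently holding port 5000 (a sketch for macOS/Linux, assuming lsof is available):

```shell
# List the process listening on TCP port 5000, if any;
# print a note instead when nothing is bound to it.
lsof -nP -iTCP:5000 -sTCP:LISTEN 2>/dev/null || echo "port 5000 is free"
```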
Install Homebrew-file
https://homebrew-file.readthedocs.io/en/latest/installation.html
$ brew install rcmdnk/file/brew-file
And run
$ brew file install
This will prepare the whole INSPIRE development setup with demo records:
make run
make setup
You can stop it by simply running
make stop
Alternatively, you can follow these steps:
docker-compose up
docker-compose exec hep-web ./scripts/setup
docker-compose exec hep-web inspirehep importer demo-records
inspirehep should now be available at http://localhost:8080
$ cd backend
$ poetry install
$ cd ui
$ yarn install
$ cd record-editor
$ yarn install
First you need to start all the services (PostgreSQL, Redis, Elasticsearch, RabbitMQ):
$ docker-compose -f docker-compose.services.yml up es mq db cache
And initialize the database, Elasticsearch, RabbitMQ, Redis, and S3:
$ cd backend
$ ./scripts/setup
Note that the S3 configuration requires the default region to be set to us-east-1. If you have another default set up in your AWS config (~/.aws/config), you need to update it!
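For reference, a minimal ~/.aws/config with the required default region could look like this:

```ini
[default]
region = us-east-1
```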
Also, to enable full-text indexing and highlighting, the following feature flags must be set to True (e.g. in backend/inspirehep/config.py):
FEATURE_FLAG_ENABLE_FULLTEXT = True
FEATURE_FLAG_ENABLE_FILES = True
You can then visit the backend at http://localhost:8000
$ cd backend
$ ./scripts/server
You can then visit the UI at http://localhost:3000
$ cd ui
$ yarn start
You can also connect the UI to another environment by changing the proxy in ui/setupProxy.js:
proxy({
target: 'http://A_PROXY_SERVER',
...
});
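For example, a full setupProxy.js could look like the following sketch, assuming the UI uses http-proxy-middleware in the usual Create React App style (the target URL and the /api path are placeholders; check the actual file for the real paths):

```javascript
// ui/setupProxy.js (sketch; adjust paths to match the real file)
const proxy = require('http-proxy-middleware');

module.exports = function (app) {
  // Forward API calls to another inspirehep environment
  // instead of the local backend.
  app.use(
    proxy('/api', {
      target: 'http://A_PROXY_SERVER', // placeholder: the environment to proxy to
      changeOrigin: true, // rewrite the Host header to match the target
    })
  );
};
```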
The backend tests locally use testmon to only run tests that depend on code that has changed (after the first run) by default:
$ cd backend
$ poetry run ./run-tests.sh
If you pass the --all flag to the run-tests.sh script, all tests will be run (this is equivalent to the --testmon-noselect flag). All other flags passed to the script are transferred to py.test, so you can do things like
$ poetry run ./run-tests.sh --pdb -k test_failing
You'll need to run all tests or force test selection (e.g. with -k) in a few cases:
- an external dependency has changed, and you want to make sure that it doesn't break the tests (as testmon doesn't track external deps)
- you manually change a test fixture in a non-python file (as testmon only tracks python imports, not external data)
If you want to invoke py.test directly but still want to use testmon, you'll need to use the --testmon --no-cov flags:
$ poetry run py.test tests/integration/records --testmon --no-cov
If you want to disable testmon test selection but still perform collection (to update test dependencies), use --testmon-noselect --no-cov instead.
Note that testmon is only used locally to speed up tests; it is not used in the CI, so that we can be completely sure all tests pass before merging a commit.
If you wish to modify the SNow integration tests, you have to set the following variables in the SNow config file:
SNOW_CLIENT_ID
SNOW_CLIENT_SECRET
SNOW_AUTH_URL
The secrets can be found in the inspirehep QA or PROD sealed secrets. After setting the variables, run the tests so that the cassettes get generated.
Before you push, don't forget to delete the secrets from the config file!
$ cd ui
$ yarn test # runs everything (lint, bundlesize etc.), identical to CI
$ yarn test:unit # will open jest on watch mode
Note that jest automatically runs the tests that are affected by changed (unstaged) files.
Runs everything from scratch, identical to CI
$ sh cypress-tests-chrome.sh
$ sh cypress-tests-firefox.sh
Opens the Cypress runner GUI and runs the tests against the local dev server (localhost:8080):
$ cd e2e
$ yarn test:dev
$ yarn test:dev --env inspirehep_url=<any url that serves inspirehep ui>
Visual tests are run only in headless mode, so yarn test:dev, which uses a headed browser, will ignore them.
Running existing visual tests and updating/creating snapshots requires the cypress-tests.sh script.
For continuous runs (when the local DB is running and has the required records etc.), the script can be reduced to only its last part: sh cypress-tests-run.sh.
If required, tests can run against localhost:3000 by simply modifying the --host option in sh cypress-tests-run.sh.
You may not always need to run tests exactly like on the CI environment.
- To run a specific suite, temporarily change the test script in e2e/package.json to cypress run --spec cypress/integration/<spec.test.js>
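For example, the scripts section of e2e/package.json might temporarily look like this (keep the <spec.test.js> placeholder until you pick a real spec file, and revert the change afterwards):

```json
{
  "scripts": {
    "test": "cypress run --spec cypress/integration/<spec.test.js>"
  }
}
```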
First make sure that you are running:
$ cd backend
$ ./scripts/server
There is a command inspirehep importer records which accepts URLs (-u), a directory of JSON files (-d), and JSON files (-f).
A selection of demo records can be found in the data directory, and they are structured based on the record type (i.e. literature). Examples:
# Local
$ poetry run inspirehep importer records -u https://inspirehep.net/api/literature/20 -u https://inspirehep.net/api/literature/1726642
# Docker
$ docker-compose exec hep-web inspirehep importer records -u https://inspirehep.net/api/literature/20 -u https://inspirehep.net/api/literature/1726642
# `--save` will save the imported record also to the data folder
$ <...> inspirehep importer records -u https://inspirehep.net/api/literature/20 --save
A valid --token or backend/inspirehep/config.py:AUTHENTICATION_TOKEN is required.
# Local
$ poetry run inspirehep importer records -d data/records/literature
# Docker
$ docker-compose exec hep-web inspirehep importer records -d data/records/literature
# Local
$ poetry run inspirehep importer records -f data/records/literature/374836.json -f data/records/authors/999108.json
# Docker
$ docker-compose exec hep-web inspirehep importer records -f data/records/literature/374836.json -f data/records/authors/999108.json
# Local
$ poetry run inspirehep importer demo-records
# Docker
$ docker-compose exec hep-web inspirehep importer demo-records