Comments (8)
Thank you, I forwarded it internally and we are looking into it.
from pbbioconda.
We'll have a quick solution in the next pb-falcon update. You will set:

export PYPEFLOW_PRE='source /home/user/myenv.sh'

That file can export anything you need in your remote environment. The specified command will be prepended to task.sh and user_script.sh, which are generally run on the remote machines (unless you run "locally").
(Someday, this will be available via fc_run.cfg instead of via your shell environment, but that will require a larger change to FALCON/etc.)
Stay tuned for an update...
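To make the mechanism concrete, here is a minimal sketch, assuming hypothetical paths (`/tmp/myenv.sh`, `/opt/conda`, `/opt/pypeflow` are made up for illustration), of what such an environment file and the corresponding PYPEFLOW_PRE setting could look like:

```shell
# Hypothetical example: myenv.sh exports whatever the remote jobs need.
cat > /tmp/myenv.sh <<'EOF'
# Anything exported here becomes visible inside task.sh / user_script.sh.
export PATH=/opt/conda/bin:$PATH
export PYTHONPATH=/opt/pypeflow/lib:$PYTHONPATH
EOF

# Tell pypeFLOW to prepend "source /tmp/myenv.sh" to its generated wrappers.
export PYPEFLOW_PRE='source /tmp/myenv.sh'
```

The single quotes matter: the value is stored verbatim and only evaluated later, when the generated wrapper runs on the remote machine.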
I'm a little confused -- will this option not work if the job_type is set to 'local'? For testing, it's valuable to separate the control flow from the queueing process.
Thanks.
It would work always, since it will get into the task.sh wrapper, which is used with any process-watcher configuration.
I don't know when we will update pb-falcon in bioconda. Maybe a week? Until then, you can pull the develop branch of pypeFLOW and try it.
Btw, we still rely on some environment variables being passed through qsub: PYPEFLOW_JOB_START_SCRIPT and PYPEFLOW_JOB_START_TIMEOUT. That's because we want to pass a pre-existing script to qsub, not a generated script. I've seen qsub (or equivalent) fail on some users' systems when a generated script is not yet available.
But I suspect your environment is being passed through qsub. I think your problem is that your installation is not on a network disk:

+ /bin/bash task.sh
/db/congenomics/local6/binaries/python-2.7.10/bin/python2.7: No module named pypeflow

local6? That sounds like a local disk.
If I'm right, then this new solution should work for you. The qsub script itself can be on a local disk because qsub actually copies it to the remote machine for us. E.g.

qsub -S /bin/bash -V foo.sh

In that case, /bin/bash obviously needs to exist on the remote machine, but foo.sh does not, since qsub will copy it. -V ensures that our PYPEFLOW_* environment variables will be available on the remote machine.
Does that make sense?
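The -V behavior can be demonstrated without a scheduler: a variable exported in the submitting shell is inherited by the child process, just as qsub -V forwards the submitter's environment to the remote job. A minimal sketch (file path and timeout value are made up):

```shell
# Simulate qsub's -V: exported variables reach the job script.
export PYPEFLOW_JOB_START_TIMEOUT=60

# foo.sh can live on a local disk; qsub would copy it to the remote node.
cat > /tmp/foo.sh <<'EOF'
#!/bin/bash
# Under "qsub -S /bin/bash -V foo.sh", PYPEFLOW_* would be visible here too.
echo "timeout is $PYPEFLOW_JOB_START_TIMEOUT"
EOF

bash /tmp/foo.sh   # prints: timeout is 60
```

Without -V (or with a scheduler configured to scrub the environment), the variable would be empty on the remote side, which is exactly the failure mode discussed in this thread.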
export PATH=/db/congenomics/local6/binaries/conda/bin:$PATH
If you're sure that's the PATH needed remotely, then I do not understand how job_start.sh is working for you. Possibly you are not using the default pwatcher_type=fs_based. We really need issue submitters to be clearer about their configuration.
And if job_start.sh does not work for you, then we will have to think more deeply about how to satisfy both you and other users simultaneously, since filesystem latency is a real problem for other users.
Job distribution is a very difficult problem. Satisfying the workflow graph is the only thing most authors of workflow systems seem to care about, but that is a trivial problem. I wish we could simply rely on a third-party job-distribution system that would satisfy all users, but I have not found one.
I'm somewhat confused by the above discussion -- all I need is the ability to include a command at the beginning of any script generated by Pypeflow. Fundamentally, it's a string substitution. File system latency is not the issue -- rather, I need to set the environment because our qsub command doesn't pass it in, even though Pypeflow specifies the copying of the environment. (qsub -A doesn't work in our environment.)
BTW, "local" refers to local to our environment, like /usr/local.
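As a sketch of that string substitution (the wrapper body below is a placeholder for illustration, not pypeFLOW's actual generated code):

```shell
PYPEFLOW_PRE='source /my/bashrc'

# Prepend the user-supplied command right after the shebang of a
# generated wrapper; everything after it is a stand-in for the real
# task commands.
{
  echo '#!/bin/bash'
  echo "$PYPEFLOW_PRE"
  echo 'echo "...generated task commands would follow..."'
} > /tmp/task.sh

sed -n 2p /tmp/task.sh   # prints: source /my/bashrc
```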
export PYPEFLOW_PRE='source /my/bashrc'