
bulkvis's Introduction

⇜ bulkvis ⇝

An app written in Python3 using Bokeh to visualise raw squiggle data from Oxford Nanopore Technologies (ONT) bulkfiles.

Quickstart

Our preferred installation method uses conda with this environment setup:

name: bulkvis
channels:
  - bioconda
  - conda-forge
  - defaults
dependencies:
  - python=3.11
  - pip
  - pip:
    - git+https://github.com/LooseLab/bulkvis.git@2.0

Either copy the YAML above into a file or:

curl -O https://raw.githubusercontent.com/LooseLab/bulkvis/2.0/env.yml
conda env create -f env.yml

Then bulkvis can be started using:

conda activate bulkvis
bulkvis serve <BULK_FILE_DIRECTORY> --show
Alternatively, install into a Python virtual environment:
# Make a python3 virtual environment
python3 -m venv bulkvis

# Activate virtual environment
source bulkvis/bin/activate

# Install bulkvis from the GitHub repository
pip install git+https://github.com/LooseLab/bulkvis.git@2.0

# Start bokeh server
bulkvis serve <BULK_FILE_DIRECTORY> --show

Other installation requirements:

To open some bulk FAST5 files, the vbz compression plugin is required. It is written and maintained by Oxford Nanopore Technologies.
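If you are unsure whether HDF5 can see the plugin, a minimal sketch like the one below (the file path is a placeholder, and this is not part of bulkvis) reports any dataset that cannot be decompressed:

# Minimal sketch: report datasets in a bulk FAST5 that cannot be read, which
# usually means the vbz plugin is not on HDF5_PLUGIN_PATH.
import h5py

def unreadable_datasets(path):
    problems = []
    with h5py.File(path, "r") as fh:
        def visit(name, obj):
            if isinstance(obj, h5py.Dataset) and obj.size:
                try:
                    _ = obj[(0,) * obj.ndim]   # read one element
                except OSError:                 # typically a missing filter plugin
                    problems.append(name)
        fh.visititems(visit)
    return problems

print(unreadable_datasets("example_bulk.fast5"))   # placeholder file name

If this prints dataset names, install ONT's vbz plugin and/or point HDF5_PLUGIN_PATH at the directory that contains it.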

bulkvis's People

Contributors

alexomics, mattloose


bulkvis's Issues

Bulkvis installed with conda on mac

Hi,
after installing bulkvis as described here, I tried to generate a fuse file as follows:
bulkvis fuse -p minimap/test_0.25.paf.gz -s rawseq/sequencing_summary.txt.gz -o bulkvis/test_025
However, the error below is triggered. Could you please help?

thank you

Traceback (most recent call last):
File "/Users/HAHasani/miniconda3/envs/bulkvis/lib/python3.7/site-packages/bulkvis/core.py", line 31, in concat_files_to_df
df_list.append(pd.read_csv(filepath_or_buffer=f, **kwargs))
File "/Users/HAHasani/miniconda3/envs/bulkvis/lib/python3.7/site-packages/pandas/io/parsers.py", line 676, in parser_f
return _read(filepath_or_buffer, kwds)
File "/Users/HAHasani/miniconda3/envs/bulkvis/lib/python3.7/site-packages/pandas/io/parsers.py", line 448, in _read
parser = TextFileReader(fp_or_buf, **kwds)
File "/Users/HAHasani/miniconda3/envs/bulkvis/lib/python3.7/site-packages/pandas/io/parsers.py", line 880, in init
self._make_engine(self.engine)
File "/Users/HAHasani/miniconda3/envs/bulkvis/lib/python3.7/site-packages/pandas/io/parsers.py", line 1114, in _make_engine
self._engine = CParserWrapper(self.f, **self.options)
File "/Users/HAHasani/miniconda3/envs/bulkvis/lib/python3.7/site-packages/pandas/io/parsers.py", line 1937, in init
_validate_usecols_names(usecols, self.orig_names)
File "/Users/HAHasani/miniconda3/envs/bulkvis/lib/python3.7/site-packages/pandas/io/parsers.py", line 1233, in _validate_usecols_names
"Usecols do not match columns, "
ValueError: Usecols do not match columns, columns expected but not found: ['filename']
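A quick, hedged diagnostic (not part of the original report) is to list the columns pandas actually finds in the summary file; newer Guppy summaries use filename_fastq rather than filename:

# Hedged diagnostic: show which columns the sequencing summary really contains.
# The path is the one used in the command above.
import pandas as pd

cols = pd.read_csv("rawseq/sequencing_summary.txt.gz", sep="\t", nrows=0).columns
print(list(cols))   # look for 'filename' vs 'filename_fastq'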

Bokeh Application Handler Error

Hi there,

I am trying to get this to work on a Windows 10 machine. The error below persists with both Python 3.5 and Python 3.7. The Bokeh window that opens in my browser is entirely blank. I'd appreciate any suggestions. Thanks!

===
(py37env) D:\Tools\Bioinformatics>bokeh serve --show bulkvis

2020-05-05 13:42:40,750 Starting Bokeh server version 0.12.15 (running on Tornado 5.0.1)
2020-05-05 13:42:40,750 Bokeh app running at: http://localhost:5006/bulkvis
2020-05-05 13:42:40,750 Starting Bokeh server with process id: 11364
2020-05-05 13:42:41,137 Error running application handler <bokeh.application.handlers.directory.DirectoryHandler object at 0x00000263371E0EF0>: 'plot_opts'
File "configparser.py", line 958, in getitem:
raise KeyError(key) Traceback (most recent call last):
File "d:\tools\bioinformatics\py37env\lib\site-packages\bokeh\application\handlers\code_runner.py", line 173, in run
exec(self._code, module.dict)
File "D:\Tools\Bioinformatics\bulkvis\main.py", line 19, in
cfg_po = config['plot_opts']
File "c:\python37\lib\configparser.py", line 958, in getitem
raise KeyError(key)
KeyError: 'plot_opts'

2020-05-05 13:42:41,153 200 GET /bulkvis (::1) 124.97ms
2020-05-05 13:42:41,388 101 GET /bulkvis/ws?bokeh-protocol-version=1.0&bokeh-session-id=vy5wnHDyUkL8mwnRHRzsDJXbl9HUHX9wwrHROmOe7DOK (::1) 0.00ms
2020-05-05 13:42:41,388 WebSocket connection opened
2020-05-05 13:42:41,388 ServerConnection created

===============================

Here's my list of packages installed.

(py37env) D:\Tools\Bioinformatics>pip freeze
bokeh==0.12.15
h5py==2.8.0
Jinja2==2.10
MarkupSafe==1.0
numpy==1.14.6
packaging==20.3
pandas==0.23.0
Pillow==5.1.0
pyparsing==2.4.7
python-dateutil==2.7.2
pytz==2018.3
PyYAML==3.12
selenium==3.11.0
six==1.11.0
tornado==5.0.1
tqdm==4.20.0
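In cases like this, the KeyError means the config.ini that the bokeh app loads is missing or lacks a [plot_opts] section. A small, hedged check (the path is an assumption; use your own) shows what configparser actually reads:

# Hedged check: confirm the config file bokeh's DirectoryHandler would load
# exists and contains the [plot_opts] section that main.py asks for.
import configparser

cfg = configparser.ConfigParser()
files_read = cfg.read("bulkvis/config.ini")   # hypothetical path
print("files read:", files_read)
print("sections:", cfg.sections())
print("has [plot_opts]:", cfg.has_section("plot_opts"))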

no files listed in browser

Hi,

we just installed bulkvis and are trying to open the bulk FAST5 files available at
https://dataverse.harvard.edu/dataset.xhtml?persistentId=doi:10.7910/DVN/RCLFNB
(we mistakenly did not generate our own on our first MinION runs).

We get the following in the browser: [screenshot attached]

The shell gives this output:

(bulkvis) user@minion:~/LooseLab$ bokeh serve --show bulkvis
2020-12-28 12:04:12,597 Starting Bokeh server version 0.12.15 (running on Tornado 5.0.1)
2020-12-28 12:04:12,599 Bokeh app running at: http://localhost:5006/bulkvis
2020-12-28 12:04:12,599 Starting Bokeh server with process id: 23782
/home/user/envs/bulkvis/lib/python3.6/site-packages/h5py/__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from float to np.floating is deprecated. In future, it will be treated as np.float64 == np.dtype(float).type.
from ._conv import register_converters as _register_converters
2020-12-28 12:04:12,941 200 GET /bulkvis (127.0.0.1) 171.51ms
2020-12-28 12:04:12,991 200 GET /bulkvis (127.0.0.1) 30.83ms
2020-12-28 12:04:13,169 101 GET /bulkvis/ws?bokeh-protocol-version=1.0&bokeh-session-id=3GOSDJuGV6cax0hxw4N9JRQTQOLXVgQxY9J7sjpemi57 (127.0.0.1) 0.71ms
2020-12-28 12:04:13,169 WebSocket connection opened
2020-12-28 12:04:13,169 ServerConnection created

We'd appreciate your assistance!
Roee Amit lab

ERROR: Failed building wheel for numpy

Hello Bulkvis team,

I am asking for help with an error where building the wheel for numpy fails.

Command lines:
python3 -m venv bulkvis
source bulkvis/bin/activate
pip3 install git+https://github.com/LooseLab/bulkvis.git@2.0

Error:
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for numpy
Failed to build numpy
ERROR: Could not build wheels for numpy, which is required to install pyproject.toml-based projects
[end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error

× pip subprocess to install build dependencies did not run successfully.
│ exit code: 1
╰─> See above for output.

note: This error originates from a subprocess, and is likely not a problem with pip.

Would be great if assistance could be provided. Thank you!

Problem with installation

Hello,
I have a problem with the installation at the stage of building dependencies.

After I run the command:
pip install -r bulkvis/requirements.txt
it fails with a huge error:

ERROR: Command errored out with exit status 1:
command: /home/threadripper/bulkvis-env/bin/python3 /home/threadripper/bulkvis-env/lib/python3.7/site-packages/pip install --ignore-installed --no-user --prefix /tmp/pip-build-env-7n5ntt49/overlay --no-warn-script-location --no-binary :none: --only-binary :none: -i https://pypi.org/simple -- wheel setuptools Cython 'numpy==1.9.3; python_version=='"'"'3.5'"'"'' 'numpy==1.12.1; python_version=='"'"'3.6'"'"'' 'numpy==1.13.1; python_version>='"'"'3.7'"'"''
cwd: None
Complete output (3954 lines):
Ignoring numpy: markers 'python_version == "3.5"' don't match your environment
Ignoring numpy: markers 'python_version == "3.6"' don't match your environment
Collecting wheel
Using cached https://files.pythonhosted.org/packages/bb/10/44230dd6bf3563b8f227dbf344c908d412ad2ff48066476672f3a72e174e/wheel-0.33.4-py2.py3-none-any.whl
Collecting setuptools
Using cached https://files.pythonhosted.org/packages/ec/51/f45cea425fd5cb0b0380f5b0f048ebc1da5b417e48d304838c02d6288a1e/setuptools-41.0.1-py2.py3-none-any.whl
Collecting Cython
Using cached https://files.pythonhosted.org/packages/14/43/8cfcae48235d2553c55f1f8bec21b8f5ae21b617d7d0e59023ab4b0ddfaf/Cython-0.29.12-cp37-cp37m-manylinux1_x86_64.whl
Collecting numpy==1.13.1
Using cached https://files.pythonhosted.org/packages/c0/3a/40967d9f5675fbb097ffec170f59c2ba19fc96373e73ad47c2cae9a30aed/numpy-1.13.1.zip
Installing collected packages: wheel, setuptools, Cython, numpy
Running setup.py install for numpy: started
Running setup.py install for numpy: still running...
Running setup.py install for numpy: finished with status 'error'
ERROR: Command errored out with exit status 1:
command: /home/threadripper/bulkvis-env/bin/python3 -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-zs1v_8d8/numpy/setup.py'"'"'; file='"'"'/tmp/pip-install-zs1v_8d8/numpy/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(file);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, file, '"'"'exec'"'"'))' install --record /tmp/pip-record-y_k32ehd/install-record.txt --single-version-externally-managed --prefix /tmp/pip-build-env-7n5ntt49/overlay --compile --install-headers /home/threadripper/bulkvis-env/include/site/python3.7/numpy
cwd: /tmp/pip-install-zs1v_8d8/numpy/
Complete output (3934 lines):
Running from numpy source directory.

ERROR: Command errored out with exit status 1: /home/threadripper/bulkvis-env/bin/python3 /home/threadripper/bulkvis-env/lib/python3.7/site-packages/pip install --ignore-installed --no-user --prefix /tmp/pip-build-env-7n5ntt49/overlay --no-warn-script-location --no-binary :none: --only-binary :none: -i https://pypi.org/simple -- wheel setuptools Cython 'numpy==1.9.3; python_version=='"'"'3.5'"'"'' 'numpy==1.12.1; python_version=='"'"'3.6'"'"'' 'numpy==1.13.1; python_version>='"'"'3.7'"'"'' Check the logs for full command output.

Issue with whale_watch.py

I'm trying to use whale_watch.py. I got the following error message when running it:

File "/kingdoms/rce/workspace8/Tranber/bulkvis-master/utils/whale_watch.py", line 294, in
main()
File "/kingdoms/rce/workspace8/Tranber/bulkvis-master/utils/whale_watch.py", line 11, in main
fused_df, ss, fused_read_ids = fuse_reads(args.summary, args.paf, args.distance, args.top, args.alt, args.debug)
File "/kingdoms/rce/workspace8/Tranber/bulkvis-master/utils/whale_watch.py", line 106, in fuse_reads
df2['cat_read_id'] = df2['all_but_last'] + "|" + df2['last_read_id']
File "/usr/local/lib/python3.5/dist-packages/pandas/core/ops.py", line 1069, in wrapper
result = safe_na_op(lvalues, rvalues)
File "/usr/local/lib/python3.5/dist-packages/pandas/core/ops.py", line 1033, in safe_na_op
return na_op(lvalues, rvalues)
File "/usr/local/lib/python3.5/dist-packages/pandas/core/ops.py", line 1023, in na_op
result[mask] = op(x[mask], y)
numpy.core._exceptions.UFuncTypeError: ufunc 'add' did not contain a loop with signature matching types (dtype('<U32'), dtype('<U32')) -> dtype('<U32')

If I understand correctly, this line causes the problem: df2['cat_read_id'] = df2['all_but_last'] + "|" + df2['last_read_id']. But I can't find a way to make it work. Could it be that my summary file is from a multiplexed run?
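A hedged workaround for this kind of pandas/numpy string-concatenation error (an assumption, not the maintainers' fix) is to cast both columns to plain Python strings before joining them, as in this self-contained sketch:

# Hedged workaround: cast both columns to object/str so pandas concatenates
# Python strings instead of falling back to numpy's fixed-width '<U32' arrays.
import pandas as pd

# Tiny stand-in for the dataframe built inside whale_watch.py
df2 = pd.DataFrame({
    "all_but_last": ["read1|read2", "read3"],
    "last_read_id": ["read3", "read4"],
})
df2["cat_read_id"] = df2["all_but_last"].astype(str) + "|" + df2["last_read_id"].astype(str)
print(df2["cat_read_id"].tolist())   # ['read1|read2|read3', 'read3|read4']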

Bulkvis with conda

Hi,

I followed the installation steps for conda as described here; I have two basic questions please:

  1. Do I still need to set up the config file?
  2. I enabled bulk fast5 generation, yet my run contains the usual folders (fastq & fast5, with [pass, fail] in each). Since I have never worked with bulk files before, I don't know which extension they have or where to find them.

Could you please help?
Thank you

"TypeError: data type 'Float64' not understood" Error in Export

I encountered an error when clicking the "Export data" button in the UI; the export fails with the following error:

(bulkvis) shorovatin@TARAS:~/assets$ bulkvis serve /home/shorovatin/assets/data --show
2022-04-29 11:44:13,365 Starting Bokeh server version 2.1.1 (running on Tornado 6.0.4)
2022-04-29 11:44:13,366 User authentication hooks NOT provided (default user enabled)
2022-04-29 11:44:13,367 Bokeh app running at: http://localhost:5006/bulkvis_server
2022-04-29 11:44:13,367 Starting Bokeh server with process id: 276595
2022-04-29 11:44:13,604 Using dir: /home/shorovatin/assets/data
2022-04-29 11:44:13,709 404 GET /favicon.ico (127.0.0.1) 0.36ms
2022-04-29 11:44:13,784 WebSocket connection opened
2022-04-29 11:44:13,784 ServerConnection created
2022-04-29 11:44:26,522 Exporting to /home/shorovatin/assets/data/WIMMER_20220224_1124_FAR92883_MN23001_sequencing_run_HMW1_2210210f_bulkvis-read_0-36000000_ch_390.fast5
2022-04-29 11:44:26,524 error handling message
 message: Message 'PATCH-DOC' content: {'events': [{'kind': 'MessageSent', 'msg_type': 'bokeh_event', 'msg_data': {'event_name': 'button_click', 'event_values': {}}}], 'references': []} 
 error: TypeError("data type 'Float64' not understood")
Traceback (most recent call last):
  File "/home/shorovatin/miniconda3/envs/bulkvis/lib/python3.7/site-packages/bokeh/server/protocol_handler.py", line 90, in handle
    work = await handler(message, connection)
  File "/home/shorovatin/miniconda3/envs/bulkvis/lib/python3.7/site-packages/bokeh/server/session.py", line 67, in _needs_document_lock_wrapper
    result = func(self, *args, **kwargs)
  File "/home/shorovatin/miniconda3/envs/bulkvis/lib/python3.7/site-packages/bokeh/server/session.py", line 261, in _handle_patch
    message.apply_to_document(self.document, self)
  File "/home/shorovatin/miniconda3/envs/bulkvis/lib/python3.7/site-packages/bokeh/protocol/messages/patch_doc.py", line 100, in apply_to_document
    doc._with_self_as_curdoc(lambda: doc.apply_json_patch(self.content, setter))
  File "/home/shorovatin/miniconda3/envs/bulkvis/lib/python3.7/site-packages/bokeh/document/document.py", line 1150, in _with_self_as_curdoc
    return f()
  File "/home/shorovatin/miniconda3/envs/bulkvis/lib/python3.7/site-packages/bokeh/protocol/messages/patch_doc.py", line 100, in <lambda>
    doc._with_self_as_curdoc(lambda: doc.apply_json_patch(self.content, setter))
  File "/home/shorovatin/miniconda3/envs/bulkvis/lib/python3.7/site-packages/bokeh/document/document.py", line 398, in apply_json_patch
    self._trigger_on_message(event_json["msg_type"], event_json["msg_data"])
  File "/home/shorovatin/miniconda3/envs/bulkvis/lib/python3.7/site-packages/bokeh/document/document.py", line 687, in _trigger_on_message
    cb(msg_data)
  File "/home/shorovatin/miniconda3/envs/bulkvis/lib/python3.7/site-packages/bokeh/document/document.py", line 354, in apply_json_event
    model._trigger_event(event)
  File "/home/shorovatin/miniconda3/envs/bulkvis/lib/python3.7/site-packages/bokeh/util/callback_manager.py", line 85, in _trigger_event
    self._document._with_self_as_curdoc(invoke)
  File "/home/shorovatin/miniconda3/envs/bulkvis/lib/python3.7/site-packages/bokeh/document/document.py", line 1150, in _with_self_as_curdoc
    return f()
  File "/home/shorovatin/miniconda3/envs/bulkvis/lib/python3.7/site-packages/bokeh/util/callback_manager.py", line 72, in invoke
    callback()
  File "/home/shorovatin/miniconda3/envs/bulkvis/lib/python3.7/site-packages/bulkvis/bulkvis_server/main.py", line 1084, in export_data
    cfg_dr["out"],
  File "/home/shorovatin/miniconda3/envs/bulkvis/lib/python3.7/site-packages/bulkvis/bulkvis_server/main.py", line 82, in export_read_file
    dtype="Float64",
  File "/home/shorovatin/miniconda3/envs/bulkvis/lib/python3.7/site-packages/h5py/_hl/attrs.py", line 149, in create
    dtype = numpy.dtype(dtype) # In case a string, e.g. 'i8' is passed
TypeError: data type 'Float64' not understood

This appears to be a mismatch between the dtype string "Float64" and numpy's "float64" type: h5py passes the string straight to numpy.dtype(), which does not recognise the capitalised name. After editing all instances of Float64 -> float64 within bulkvis_server/main.py, the error resolves, and bulkvis exports the selected region to a FAST5 as expected.

I am creating this issue to document a potential solution to the above error. Any feedback would be appreciated.

I am running on Ubuntu 20.04, using the conda environment outlined in the YAML provided in the 2.0 branch.
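For reference, the mismatch the traceback points at can be reproduced in isolation (illustrative only; behaviour may differ across NumPy versions):

# numpy.dtype() accepts the lowercase name but, at least on the versions in
# this report, rejects the capitalised pandas-style name that h5py was given.
import numpy as np

print(np.dtype("float64"))
try:
    np.dtype("Float64")
except TypeError as exc:
    print("TypeError:", exc)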

issue with whale_merge.py (file format ?)

I'm now trying whale_merge, as whale_watch is working.
Our fastq id string has this format:

@ch134_read207_45c3878c-f8ae-4aee-8e48-93e8f8e1e332_pass_FAL84658

This is different from the format the script expects.

I have modified part of the whale_watch.py script to make it work with our fastq format:

                while line:
                    if len(line.split('_')) > 1: #means we have a fastq ID line
                        if cnt%400 == 0:
                            print(myreadtracker.result())
                        read_id = line.split('_')[2]

However, it doesn't produce the new fastq at the end, and I don't get any error messages. I'm not really familiar with Python; am I missing something when adapting the script?
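A more format-tolerant way to pull the read id out of a header like the one above (a sketch, not part of whale_merge.py) is to match the UUID directly rather than rely on a fixed split position:

# Hedged sketch: extract the read UUID from a fastq header such as
# "@ch134_read207_45c3878c-f8ae-4aee-8e48-93e8f8e1e332_pass_FAL84658".
import re

UUID_RE = re.compile(r"[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}")

def read_id_from_header(header):
    match = UUID_RE.search(header)
    return match.group(0) if match else None

print(read_id_from_header("@ch134_read207_45c3878c-f8ae-4aee-8e48-93e8f8e1e332_pass_FAL84658"))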

bulk fast5 file

Hello,

thank you for developing this game-changing tool!

One question regarding the .fast5 files. If I did not switch on the "bulk fast5 option" before starting the sequencing run, can I still generate the bulk file from individual .fast5 files by concatenating the individual files?

Thanks!

Overlapping text display on the browser

Hello maintainers,

Thanks for developing BulkVis! I'm trying to use it to visualise the NA12878 direct RNA Notts Run 1 data, and the display looks garbled in my browser. I'm wondering what can be done? I tried both Firefox and Chrome, and they behave the same. The screenshot is attached.


Thanks,
Chen

Installation

Hello,

Can you help me with the installation?

cd bulkvis
python utils/set_config.py -b <> -i /path/to/bulkfile/directory -e /path/to/readfile/directory -m /path/to/mapfile/directory -c config.ini

On this step:
-i: is this the path to the folder I cloned?
-e: the read file directory?
-m: the map file directory?

Thank you!

argparse needs special handling of bokeh args for bokeh 2.4.3 or later

When building with a slightly more recent version of bokeh (2.4.3 instead of 2.1.0), bulkvis.py needs to handle Serve.args specially, e.g. with code like this:

from bokeh.command.subcommand import Argument
from bokeh.util.dataclasses import entries

...

    if isinstance(opts, Argument):
        opts = dict(entries(opts))

This way the use of add_argument will work even though the Serve.args are wrapped in bokeh's Argument class.
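A fuller sketch of how that tweak could slot into an argparse loop over Serve.args (the loop and parser name are assumptions, not bulkvis.py's actual code):

# Hedged sketch: build argparse arguments from bokeh's Serve.args, unwrapping
# the Argument dataclass that bokeh >= 2.4.3 uses for option metadata.
import argparse

from bokeh.command.subcommand import Argument
from bokeh.command.subcommands.serve import Serve
from bokeh.util.dataclasses import entries

parser = argparse.ArgumentParser(prog="bulkvis serve")
for flags, opts in Serve.args:
    if not isinstance(flags, tuple):
        flags = (flags,)
    if isinstance(opts, Argument):
        opts = dict(entries(opts))   # unwrap to a plain kwargs dict
    parser.add_argument(*flags, **opts)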

whale_merge.py

Dear bulkvis team,

I'm trying to use your program to correct reads for over-splitting, which seems to be a common feature in my data. I'm using whale_merge.py to get a fastq file for downstream analysis. The program seems to run well, identifies reads and even concatenates them, but at the end the output fastq is completely empty. I have tried adjusting the -d 50000 parameter to several higher and lower cutoffs and still don't get any output. This is the command I use:

python3 whale_merge.py -s
Original read count: 2189756
Un-fused reads: 2162765
Reads joined: 26991
Fused reads: 13378
New read count: 2176143
Total bases: 8986185817

                 MIN    MAX   MEAN    N50
Original:          4  96900   4103   4341
Un-fused reads:    4  96900   4123   4346
To be fused:     124  12405   2557   3488
Fused read:      466  19120   5159   5336
New:               4  96900   4129   4350
Top 10 original reads by length:
1: 96900
2: 90594
3: 51861
4: 51510
5: 51501
6: 41520
7: 30328
8: 29367
9: 28218
10: 25478
Top 10 fused reads by combined length:
1: 19120
2: 18797
3: 18126
4: 17997
5: 17600
6: 16815
7: 16651
8: 16238
9: 16106
10: 16082
Top 10 reads after correction:
1: 96900
2: 90594
3: 51861
4: 51510
5: 51501
6: 41520
7: 30328
8: 29367
9: 28218
10: 25478
2189756 reads to process.
!!!!!!! 0
2189756 to process, 0 fused, 0 unfused, 0 written.

Thank you very much!

Location of NC_sequencing.py

I recently updated my version of MinKNOW and now cannot locate the Python script needed to turn on the saving of bulk files. Is there a way to locate this script, and/or which folders should I be looking in?

Thanks for your help

No bulk fast5 shown in web browser

Hi,
I successfully installed bulkvis, but when I start it with the following command, nothing is shown on the web page:
$ bokeh serve --show bulkvis-master

[screenshot attached]

The config.ini is as follows:
[data]
dir = /Share2/home/lifj/Projects/12.Nanopore/Zhiyuan-20201104/Data/20201102_XRNAX_test_1_RNA002_85ng/no_sample/20201102_1331_MN32106_FAL58344_8d29359e/fast5_pass/
out = /Share2/home/lifj/Projects/12.Nanopore/Zhiyuan-20201104/01.base_calling/fastq/
map = /Share2/home/lifj/Projects/12.Nanopore/Zhiyuan-20201104/06.minimap2/withSecondaryAlignment/sam/
[plot_opts]
wdg_width = 300
plot_width = 1600
plot_height = 1000
y_min = 0
y_max = 2200
label_height = 800
upper_cut_off = 2200
lower_cut_off = -1000
;recommended that this is kept as 'canvas' opts: ['canvas', 'svg', 'webgl']
output_backend = canvas
[labels]
adapter = True
mux_uncertain = True
strand1 = False
user1 = True
event = False
user2 = False
multiple = False
unclassed = False
pore = True
strand = True
transition = True
unavailable = False
zero = False
saturated = False
unblocking = True
good_single = False
below = False
above = True
inrange = False
unclassified = False
unclassified_following_reset = False
pending_manual_reset = False
pending_mux_change = False

I have no idea what is going on and am totally stuck. Could you help me solve this problem?

unable to connect to dbus

Hiya. Keen on using bulkvis.
Managed to get some of the way through the install.

Q1: does a 'bulkfile' = a fast5 file that contains other, individual fast5 files?
Q2: did basecalling need to be enabled? Can I just look at squiggles?

Here's a screenshot of something happening.....

[screenshot attached]

... but logging to console says the following.....

~/.virtualenvs/bulkvis/bulkvis$ bokeh serve --show ~/.virtualenvs/bulkvis/bulkvis
2020-01-16 12:12:17,847 Starting Bokeh server version 0.12.15 (running on Tornado 5.0.1)
2020-01-16 12:12:17,849 Bokeh app running at: http://localhost:5006/bulkvis
2020-01-16 12:12:17,849 Starting Bokeh server with process id: 6963

** (/usr/lib/firefox/firefox:7105): WARNING **: 12:12:18.435: Unable to connect to dbus: Could not connect: Connection refused
[2020-01-16T12:12:18Z ERROR audio_thread_priority::rt_linux] setrlimit64: 1
/home/mike.harbour/.virtualenvs/bulkvis/lib/python3.6/site-packages/h5py/__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from float to np.floating is deprecated. In future, it will be treated as np.float64 == np.dtype(float).type.
from ._conv import register_converters as _register_converters

** (/usr/lib/firefox/firefox:7150): WARNING **: 12:12:18.903: Unable to connect to dbus: Could not connect: Connection refused
2020-01-16 12:12:18,908 200 GET /bulkvis (127.0.0.1) 177.64ms
2020-01-16 12:12:19,112 200 GET /static/css/bokeh.min.css?v=f7fe43f4980a82921415acbd60d5fd66 (127.0.0.1) 7.16ms
2020-01-16 12:12:19,316 101 GET /bulkvis/ws?bokeh-protocol-version=1.0&bokeh-session-id=qV2dJR0Tsr3asgoqy4jSoZ3X7jM3tLA69dQJIefqqZVN (127.0.0.1) 1.10ms
2020-01-16 12:12:19,316 WebSocket connection opened
2020-01-16 12:12:19,316 ServerConnection created

** (/usr/lib/firefox/firefox:7213): WARNING **: 12:12:19.502: Unable to connect to dbus: Could not connect: Connection refused

... the symptom is that no bulk files seem to be available to choose.
I put a multi-read FAST5 file into the folder set as the 'bulkfile directory' in my config.ini.

Any help gratefully received
Mike

create bulk FAST5 file

Hi Alexomics,

I checked the bulk raw output items; however, I still could not obtain an analyzable bulk FAST5.
Could you tell me how to generate an analyzable bulk FAST5?
Thank you in advance.

Originally posted by @liu-pinnng in #30 (comment)

whale_merge.py sequence naming

Hi there,

I'm not sure if others have had this issue, but the naming convention of fused reads in whale_merge.py can be problematic for htslib/samtools.

Samtools throws the error:

[E::sam_parse1] query name too long
[W::sam_read1] Parse error at line 281431
[main_samview] truncated file.

This also affects other tools that rely on samtools.

This seems to occur when multiple reads are fused together and named with XX|YY|ZZ..., and the length of the query name exceeds 251 characters, as described here: pysam-developers/pysam#447

A possible solution would be to slice the FASTQ name to 250 characters or fewer.

Thanks,

dave
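A minimal illustration of that workaround (not whale_merge.py's actual naming code):

# Hedged example: cap fused read names so they stay under htslib's
# 251-character query-name limit.
MAX_QNAME_LEN = 250

def fused_read_name(read_ids):
    name = "|".join(read_ids)
    return name[:MAX_QNAME_LEN]

print(fused_read_name(["read-a", "read-b", "read-c"]))   # read-a|read-b|read-c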

bokeh KeyError 'plot_opts'

Hello,

I had problems with a matching distribution for MarkupSafe, so I installed all the requirements manually.
Then I executed the following command:
bokeh serve --show bulkvis

Then I got these errors:
2021-05-11 12:06:40,375 Starting Bokeh server version 2.3.2 (running on Tornado 6.1)
2021-05-11 12:06:40,376 User authentication hooks NOT provided (default user enabled)
2021-05-11 12:06:40,377 Bokeh app running at: http://localhost:5006/bulkvis
2021-05-11 12:06:40,377 Starting Bokeh server with process id: 3152
2021-05-11 12:06:40,599 Error running application handler <bokeh.application.handlers.directory.DirectoryHandler object at 0x7ff5c5ce0c70>: 'plot_opts'
File "configparser.py", line 960, in getitem:
raise KeyError(key) Traceback (most recent call last):
File "/home/sumin/envs/bulkvis/lib/python3.8/site-packages/bokeh/application/handlers/code_runner.py", line 197, in run
exec(self._code, module.dict)
File "/home/sumin/bulkvis/main.py", line 19, in
cfg_po = config['plot_opts']
File "/home/sumin/miniconda3/lib/python3.8/configparser.py", line 960, in getitem
raise KeyError(key)
KeyError: 'plot_opts'

2021-05-11 12:06:40,706 WebSocket connection opened
2021-05-11 12:06:40,706 ServerConnection created

Somehow the bulkvis webpage opens, but I can't see anywhere to select my fast5 files.
[screenshot attached]

Update whale-watch.py

I think recent file format changes can cause whale-watch.py to fail. I was able to fix it with two tweaks:

Line 30: change 'fastq' to 'filename_fastq'

Line 52: add 'skiprows=6,' to skip header in PAF file.

Cheers,

dave

"0 files written" with gen_bmf.py

Hello,

I was using BulkVis to look at raw signals and wanted to align them to the reference sequence. I saw your comment on issue #34 indicating that the script gen_bmf.py could be used.

When I ran this:

(bulkvis) sanjin@LAPTOP-6ESGOMF6:~/bulkvis$ python3 gen_bmf.py -s /mnt/f/D1_sample/sequencing_summary.txt -p /mnt/f/D1_sample/filt_sample.paf --bmf /mnt/f/D1_sample/

I got: "0 files written to /mnt/f/D1_sample"

I noticed the help menu says "a sequencing summary file generated by albacore" and was wondering if this could be the issue, since I used a sequencing_summary.txt from Guppy's output.

I also noticed that the previous BulkVis required the config.ini file and that 2.0 no longer does... however, for running gen_bmf.py it says "Specify the output folder, where files will be written as <run_id>.bmf. This should be the 'map' path specified in the config.ini".
If I never made a config.ini file in the 2.0 installation, and cannot find a config.ini file in bulkvis or any other folder, does it still matter where I point the path for the --bmf option?

Thanks for the resource and help, I appreciate all the work that has gone into BulkVis!

best,
S

No file could be selected in the web browser

Hi, I was running:

(bulkvis) PS C:\WINDOWS\system32> bulkvis serve "C:\D\backup\fast5\barcode01" --show
2023-11-28 12:27:09,066 Starting Bokeh server version 2.3.3 (running on Tornado 6.3.3)
2023-11-28 12:27:09,068 User authentication hooks NOT provided (default user enabled)
2023-11-28 12:27:09,074 Bokeh app running at: http://localhost:5006/bulkvis_server
2023-11-28 12:27:09,074 Starting Bokeh server with process id: 12240
2023-11-28 12:27:09,333 Using dir: C:\D\backup\fast5\barcode01
2023-11-28 12:27:09,456 WebSocket connection opened
2023-11-28 12:27:09,457 ServerConnection created

And the web browser popped up but no file could be selected:
[screenshot attached]

bulk issues

Hello,
When defining the location of the bulk file, it says the file doesn't exist (when it does), and I was wondering how to resolve this or what the reason behind it is. I am unable to open any files in the viewer.
OSError: Unable to open file (unable to open file: name = 'FAL14239_eb9fb0f7e0920ba19138e7f6598e04c2da9b8e82_96.fast5', errno = 2, error message = 'No such file or directory', flags = 0, o_flags = 0)

Bulk FAST5 used in BulkVis Bioinformatics publication

This is a question,

I'm trying to launch the browser and view the signal data as you do in Figure 2 of the BulkVis paper. Here's what I do:

  1. Download the supplementary data you provide HERE
  2. Install BulkVis as specified in the installation.
  3. Attempt to configure for Collection001 data set.
  • I run into an issue when attempting to run set_config.py: I don't know where the bulkfile from the original MinION run is.

Is it possible to get access to the bulkfile so I'm able to properly configure BulkVis?

Installation problems

hi,

I am trying to install numpy on window python 3.4.4
but the error occurred, i even reinstall the whole python and assign the path to the administration system . Kindly anyone of you guides me or help me, how to install. If the error is,
error: Microsoft Visual C++ 10.0 is required. Get it with "Microsoft Windows S
DK 7.1": www.microsoft.com/download/details.aspx?id=8279
ERROR: Failed building wheel for numpy
Running setup.py clean for numpy
ERROR: Command errored out with exit status 1:
command: 'c:\python34\python.exe' -u -c 'import sys, setuptools, tokenize; sy
s.argv[0] = '"'"'C:\Users\an95120\AppData\Local\Temp\pip-install-xuqde9e1
\numpy\setup.py'"'"'; file='"'"'C:\Users\an95120\AppData\Local\Temp\p
ip-install-xuqde9e1\numpy\setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', op
en)(file);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec
(compile(code, file, '"'"'exec'"'"'))' clean --all
cwd: C:\Users\an95120\AppData\Local\Temp\pip-install-xuqde9e1\numpy
Complete output (10 lines):
Running from numpy source directory.

setup.py clean is not supported, use one of the following instead:

- `git clean -xdf` (cleans all files)
- `git clean -Xdf` (cleans all versioned files, doesn't touch
                    files that aren't checked into the git repo)

Add --force to your command to use it anyway if you must (unsupported).

ERROR: Failed cleaning build dir for numpy
Failed to build numpy
Installing collected packages: numpy
Running setup.py install for numpy ... error
ERROR: Command errored out with exit status 1:
command: 'c:\python34\python.exe' -u -c 'import sys, setuptools, tokenize;
sys.argv[0] = '"'"'C:\Users\an95120\AppData\Local\Temp\pip-install-xuqde9e
1\numpy\setup.py'"'"'; file='"'"'C:\Users\an95120\AppData\Local\Temp
\pip-install-xuqde9e1\numpy\setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"',
open)(file);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();ex
ec(compile(code, file, '"'"'exec'"'"'))' install --record 'C:\Users\an95120
AppData\Local\Temp\pip-record-41uj2idn\install-record.txt' --single-version-exte
rnally-managed --compile
cwd: C:\Users\an95120\AppData\Local\Temp\pip-install-xuqde9e1\numpy
Complete output (226 lines):
Running from numpy source directory.

Note: if you need reliable uninstall behavior, then install
with pip instead of using `setup.py install`:

  - `pip install .`       (from a git repo or downloaded source
                           release)
  - `pip install numpy`   (last NumPy release on PyPi)


blas_opt_info:
blas_mkl_info:
customize MSVCCompiler
  libraries mkl_rt not found in ['c:\\python34\\lib', 'C:\\', 'c:\\python34\

\libs']
NOT AVAILABLE

blis_info:
customize MSVCCompiler
  libraries blis not found in ['c:\\python34\\lib', 'C:\\', 'c:\\python34\\l

ibs']
NOT AVAILABLE

openblas_info:
customize MSVCCompiler
customize MSVCCompiler
  libraries openblas not found in ['c:\\python34\\lib', 'C:\\', 'c:\\python3

4\libs']
get_default_fcompiler: matching types: '['gnu', 'intelv', 'absoft', 'compaqv
', 'intelev', 'gnu95', 'g95', 'intelvem', 'intelem', 'flang']'
customize GnuFCompiler
Could not locate executable g77
Could not locate executable f77
customize IntelVisualFCompiler
Could not locate executable ifort
Could not locate executable ifl
customize AbsoftFCompiler
Could not locate executable f90
customize CompaqVisualFCompiler
Could not locate executable DF
customize IntelItaniumVisualFCompiler
Could not locate executable efl
customize Gnu95FCompiler
Could not locate executable gfortran
Could not locate executable f95
customize G95FCompiler
Could not locate executable g95
customize IntelEM64VisualFCompiler
customize IntelEM64TFCompiler
Could not locate executable efort
Could not locate executable efc
customize PGroupFlangCompiler
don't know how to compile Fortran code on platform 'nt'
NOT AVAILABLE

atlas_3_10_blas_threads_info:
Setting PTATLAS=ATLAS
customize MSVCCompiler
  libraries tatlas not found in ['c:\\python34\\lib', 'C:\\', 'c:\\python34\

\libs']
NOT AVAILABLE

atlas_3_10_blas_info:
customize MSVCCompiler
  libraries satlas not found in ['c:\\python34\\lib', 'C:\\', 'c:\\python34\

\libs']
NOT AVAILABLE

atlas_blas_threads_info:
Setting PTATLAS=ATLAS
customize MSVCCompiler
  libraries ptf77blas,ptcblas,atlas not found in ['c:\\python34\\lib', 'C:\\

', 'c:\python34\libs']
NOT AVAILABLE

atlas_blas_info:
customize MSVCCompiler
  libraries f77blas,cblas,atlas not found in ['c:\\python34\\lib', 'C:\\', '

c:\python34\libs']
NOT AVAILABLE

accelerate_info:
  NOT AVAILABLE

C:\Users\an95120\AppData\Local\Temp\pip-install-xuqde9e1\numpy\numpy\distuti

ls\system_info.py:639: UserWarning:
Atlas (http://math-atlas.sourceforge.net/) libraries not found.
Directories to search for the libraries can be specified in the
numpy/distutils/site.cfg file (section [atlas]) or by setting
the ATLAS environment variable.
self.calc_info()
blas_info:
customize MSVCCompiler
libraries blas not found in ['c:\python34\lib', 'C:', 'c:\python34\l
ibs']
NOT AVAILABLE

C:\Users\an95120\AppData\Local\Temp\pip-install-xuqde9e1\numpy\numpy\distuti

ls\system_info.py:639: UserWarning:
Blas (http://www.netlib.org/blas/) libraries not found.
Directories to search for the libraries can be specified in the
numpy/distutils/site.cfg file (section [blas]) or by setting
the BLAS environment variable.
self.calc_info()
blas_src_info:
NOT AVAILABLE

C:\Users\an95120\AppData\Local\Temp\pip-install-xuqde9e1\numpy\numpy\distuti

ls\system_info.py:639: UserWarning:
Blas (http://www.netlib.org/blas/) sources not found.
Directories to search for the sources can be specified in the
numpy/distutils/site.cfg file (section [blas_src]) or by setting
the BLAS_SRC environment variable.
self.calc_info()
NOT AVAILABLE

'svnversion' is not recognized as an internal or external command,
operable program or batch file.
non-existing path in 'numpy\\distutils': 'site.cfg'
lapack_opt_info:
lapack_mkl_info:
customize MSVCCompiler
  libraries mkl_rt not found in ['c:\\python34\\lib', 'C:\\', 'c:\\python34\

\libs']
NOT AVAILABLE

openblas_lapack_info:
customize MSVCCompiler
customize MSVCCompiler
  libraries openblas not found in ['c:\\python34\\lib', 'C:\\', 'c:\\python3

4\libs']
NOT AVAILABLE

openblas_clapack_info:
customize MSVCCompiler
customize MSVCCompiler
  libraries openblas,lapack not found in ['c:\\python34\\lib', 'C:\\', 'c:\\

python34\libs']
NOT AVAILABLE

atlas_3_10_threads_info:
Setting PTATLAS=ATLAS
customize MSVCCompiler
  libraries lapack_atlas not found in c:\python34\lib
customize MSVCCompiler
  libraries tatlas,tatlas not found in c:\python34\lib
customize MSVCCompiler
  libraries lapack_atlas not found in C:\
customize MSVCCompiler
  libraries tatlas,tatlas not found in C:\
customize MSVCCompiler
  libraries lapack_atlas not found in c:\python34\libs
customize MSVCCompiler
  libraries tatlas,tatlas not found in c:\python34\libs
<class 'numpy.distutils.system_info.atlas_3_10_threads_info'>
  NOT AVAILABLE

atlas_3_10_info:
customize MSVCCompiler
  libraries lapack_atlas not found in c:\python34\lib
customize MSVCCompiler
  libraries satlas,satlas not found in c:\python34\lib
customize MSVCCompiler
  libraries lapack_atlas not found in C:\
customize MSVCCompiler
  libraries satlas,satlas not found in C:\
customize MSVCCompiler
  libraries lapack_atlas not found in c:\python34\libs
customize MSVCCompiler
  libraries satlas,satlas not found in c:\python34\libs
<class 'numpy.distutils.system_info.atlas_3_10_info'>
  NOT AVAILABLE

atlas_threads_info:
Setting PTATLAS=ATLAS
customize MSVCCompiler
  libraries lapack_atlas not found in c:\python34\lib
customize MSVCCompiler
  libraries ptf77blas,ptcblas,atlas not found in c:\python34\lib
customize MSVCCompiler
  libraries lapack_atlas not found in C:\
customize MSVCCompiler
  libraries ptf77blas,ptcblas,atlas not found in C:\
customize MSVCCompiler
  libraries lapack_atlas not found in c:\python34\libs
customize MSVCCompiler
  libraries ptf77blas,ptcblas,atlas not found in c:\python34\libs
<class 'numpy.distutils.system_info.atlas_threads_info'>
  NOT AVAILABLE

atlas_info:
customize MSVCCompiler
  libraries lapack_atlas not found in c:\python34\lib
customize MSVCCompiler
  libraries f77blas,cblas,atlas not found in c:\python34\lib
customize MSVCCompiler
  libraries lapack_atlas not found in C:\
customize MSVCCompiler
  libraries f77blas,cblas,atlas not found in C:\
customize MSVCCompiler
  libraries lapack_atlas not found in c:\python34\libs
customize MSVCCompiler
  libraries f77blas,cblas,atlas not found in c:\python34\libs
<class 'numpy.distutils.system_info.atlas_info'>
  NOT AVAILABLE

lapack_info:
customize MSVCCompiler
  libraries lapack not found in ['c:\\python34\\lib', 'C:\\', 'c:\\python34\

\libs']
NOT AVAILABLE

C:\Users\an95120\AppData\Local\Temp\pip-install-xuqde9e1\numpy\numpy\distuti

ls\system_info.py:639: UserWarning:
Lapack (http://www.netlib.org/lapack/) libraries not found.
Directories to search for the libraries can be specified in the
numpy/distutils/site.cfg file (section [lapack]) or by setting
the LAPACK environment variable.
self.calc_info()
lapack_src_info:
NOT AVAILABLE

C:\Users\an95120\AppData\Local\Temp\pip-install-xuqde9e1\numpy\numpy\distuti

ls\system_info.py:639: UserWarning:
Lapack (http://www.netlib.org/lapack/) sources not found.
Directories to search for the sources can be specified in the
numpy/distutils/site.cfg file (section [lapack_src]) or by setting
the LAPACK_SRC environment variable.
self.calc_info()
NOT AVAILABLE

c:\python34\lib\distutils\dist.py:260: UserWarning: Unknown distribution opt

ion: 'define_macros'
warnings.warn(msg)
running install
running build
running config_cc
unifing config_cc, config, build_clib, build_ext, build commands --compiler
options
running config_fc
unifing config_fc, config, build_clib, build_ext, build commands --fcompiler
options
running build_src
build_src
building py_modules sources
building library "npymath" sources
error: Microsoft Visual C++ 10.0 is required. Get it with "Microsoft Windows
SDK 7.1": www.microsoft.com/download/details.aspx?id=8279

ERROR: Command errored out with exit status 1: 'c:\python34\python.exe' -u -c 'i
mport sys, setuptools, tokenize; sys.argv[0] = '"'"'C:\Users\an95120\AppData
\Local\Temp\pip-install-xuqde9e1\numpy\setup.py'"'"'; file='"'"'C:\User
s\an95120\AppData\Local\Temp\pip-install-xuqde9e1\numpy\setup.py'"'"';f=g
etattr(tokenize, '"'"'open'"'"', open)(file);code=f.read().replace('"'"'\r\n
'"'"', '"'"'\n'"'"');f.close();exec(compile(code, file, '"'"'exec'"'"'))' in
stall --record 'C:\Users\an95120\AppData\Local\Temp\pip-record-41uj2idn\install-
record.txt' --single-version-externally-managed --compile Check the logs for ful
l command output.

Hello, I am having the same issue. Did you manage to fix this?

Originally posted by @Cerandior in #27 (comment)

How can I check I have the correct bulk-fast5 file?

I did not do that sequencing run first-hand. Is there any way I can check that I have the right bulk FAST5 files to use with bulkvis?

For example, one of my file:

filename = AGQ569_pass_barcode12_3dd4cb83_0.fast5
size = 62.3 Mb

Does that sound plausible to you?

Given the size I'm pretty sure it's not single-read (it was also generated after Guppy moved away from single-read fast5 files, if I remember correctly), but I am a bit confused about whether "bulk" means something more specific here.

For example, I read that the example bulk FAST5 you provide is 21 GB. Does that mean a bulk FAST5 is one fast5 file for the whole run? In that case I might not have the right data format.
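One quick, hedged check (based on typical ONT file layouts, not on bulkvis itself) is to list the file's top-level groups: a bulk file has groups such as Raw, IntermediateData and StateData, while a per-read file contains read_<uuid> groups.

# Hedged check of what kind of FAST5 this is; the file name is the one quoted
# above, adjust it to your own path.
import h5py

with h5py.File("AGQ569_pass_barcode12_3dd4cb83_0.fast5", "r") as fh:
    keys = list(fh.keys())
print(keys[:5])
print("looks like a bulk file:", "IntermediateData" in keys or "StateData" in keys)
print("looks like a per-read file:", any(k.startswith("read_") for k in keys))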

having some trouble with whale_watch.py ?

I'm trying to use the whale_watch.py script to find fused reads.

The error message is:
TypeError: unsupported operand type(s) for -: 'str' and 'str'

I think this is where the error is coming from:
File "whale_watch.py", line 36, in fuse_reads
ss['diff'] = ss['start_time'].shift(-1) - (ss['start_time'] + ss['duration'])

Thanks for any assistance!
This might be something basic on my end, in which case apologies.

Here is the whole error message I am getting:

Traceback (most recent call last):
File "whale_watch.py", line 289, in
main()
File "whale_watch.py", line 11, in main
fused_df, ss, fused_read_ids = fuse_reads(args.summary, args.paf, args.distance, args.top, args.alt, args.debug)
File "whale_watch.py", line 36, in fuse_reads
ss['diff'] = ss['start_time'].shift(-1) - (ss['start_time'] + ss['duration'])
File "/usr/lib/anaconda3/lib/python3.6/site-packages/pandas/core/ops.py", line 1069, in wrapper
result = safe_na_op(lvalues, rvalues)
File "/usr/lib/anaconda3/lib/python3.6/site-packages/pandas/core/ops.py", line 1037, in safe_na_op
lambda x: op(x, rvalues))
File "pandas/_libs/algos_common_helper.pxi", line 1212, in pandas._libs.algos.arrmap_object
File "/usr/lib/anaconda3/lib/python3.6/site-packages/pandas/core/ops.py", line 1037, in
lambda x: op(x, rvalues))
TypeError: unsupported operand type(s) for -: 'str' and 'str'
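The traceback suggests start_time and duration were read as strings. A hedged workaround (not the maintainers' fix) is to coerce them to numbers before the subtraction quoted above:

# Hedged workaround: make the timing columns numeric so the shift/subtract in
# whale_watch.py's fuse_reads() has numbers to work with.
import pandas as pd

ss = pd.read_csv("sequencing_summary.txt", sep="\t")   # path is a placeholder
for col in ("start_time", "duration"):
    ss[col] = pd.to_numeric(ss[col], errors="coerce")
ss["diff"] = ss["start_time"].shift(-1) - (ss["start_time"] + ss["duration"])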

exporting signal data

Hi, I'm a researcher studying Nanopore data.
I have two questions about the bulkvis tool.

  1. Can I export the numeric signal data that is visualised as a graph, in .csv or another format?
  2. Can I select signal data in a fast5 file by genomic position, annotated using the fastq files?

Thank you for reading my questions.
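On question 1, the raw signal can be read from a bulk FAST5 with h5py and written to CSV directly; the sketch below assumes the usual bulk-file layout (/Raw/Channel_N/Signal) and placeholder names, and is not a bulkvis feature:

# Hedged sketch: dump the raw signal for one channel of a bulk FAST5 to CSV.
# File name and channel number are placeholders.
import h5py
import numpy as np

with h5py.File("bulk.fast5", "r") as fh:
    signal = fh["Raw/Channel_390/Signal"][()]   # raw DAC values for one channel

np.savetxt("channel_390_signal.csv", signal, fmt="%d", delimiter=",")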

cDNA/dRNA splice mapping

Is Bulkvis designed to handle cDNA/dRNA splice alignment to the genome? What about aligning the cDNA/dRNA reads to a reference transcriptome?

Or is the expectation that the input is gDNA?

Cheers,
Baraa

Cannot install

Hi,
I am keen on using bulkvis.
However, I cannot install it.
When I try to run "pip install -r requirements.txt",
the following error occurs:

ERROR: Command errored out with exit status 1:
command: /Users/pingliu/envs/bulkvis/bin/python3 /Users/pingliu/envs/bulkvis/lib/python3.7/site-packages/pip install --ignore-installed --no-user --prefix /private/var/folders/mz/4n1rp84s60j5p6mrh_7xgy5h0000gn/T/pip-build-env-wgn967th/overlay --no-warn-script-location --no-binary :none: --only-binary :none: -i https://pypi.org/simple -- wheel setuptools Cython 'numpy==1.9.3; python_version=='"'"'3.5'"'"'' 'numpy==1.12.1; python_version=='"'"'3.6'"'"'' 'numpy==1.13.1; python_version>='"'"'3.7'"'"''
cwd: None
Complete output (4381 lines):
Ignoring numpy: markers 'python_version == "3.5"' don't match your environment
Ignoring numpy: markers 'python_version == "3.6"' don't match your environment
Collecting wheel
Using cached wheel-0.34.2-py2.py3-none-any.whl (26 kB)
Collecting setuptools
Using cached setuptools-47.3.1-py3-none-any.whl (582 kB)
Collecting Cython
Using cached Cython-0.29.20-cp37-cp37m-macosx_10_9_x86_64.whl (1.9 MB)
Collecting numpy==1.13.1
Using cached numpy-1.13.1.zip (5.0 MB)
Using legacy setup.py install for numpy, since package 'wheel' is not installed.
Installing collected packages: wheel, setuptools, Cython, numpy
Running setup.py install for numpy: started
Running setup.py install for numpy: still running...
Running setup.py install for numpy: finished with status 'error'

Can you help me work out this problem?
Thank you in advance.

h5py

I got this message:
KeyError: "Unable to open object (object 'IntermediateData' doesn't exist)"
full details:
command is:
$ bulkvis>python utils\set_config.py -b ..\test\FAK47977_78c30614cc65b2b97c46e55b3057dc404e30b4db_0.fast5 -i ..\test -m ..\test -e ..\test -c config.ini
Error message
Traceback (most recent call last):
File "utils\set_config.py", line 121, in
main()
File "utils\set_config.py", line 74, in main
int_data_path = bulkfile["IntermediateData"]
File "h5py_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
File "h5py_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
File "C:\Users\moham\bulkvis\lib\site-packages\h5py_hl\group.py", line 262, in getitem
oid = h5o.open(self.id, self._e(name), lapl=self._lapl)
File "h5py_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
File "h5py_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
File "h5py\h5o.pyx", line 190, in h5py.h5o.open
KeyError: "Unable to open object (object 'IntermediateData' doesn't exist)"

It seems to be a problem with h5py. I tried it on 3 machines (one Windows and two Linux) and got the same problem. Other tools that use h5py work fine. How can I fix this?

Regards,
