ggirelli / gpseqc

Nuclear centrality estimation from GPSeq experiment.

License: MIT License

Python 100.00%
centrality nuclear-centrality nuclear-architecture genome-architecture sequencing gpseq

gpseqc's Issues

Sorting error in bedtools intersect

This issue affects

  • Centrality estimation (gpseqc_estimate)
  • Ranks comparison (gpseqc_compare)

What did you do (e.g., steps to reproduce)

gpseqc_estimate '/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK93_1min_GG__cutsiteLoc-umiCount.bed' '/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK94_5min_GG__cutsiteLoc-umiCount.bed' '/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK95_10min_GG__cutsiteLoc-umiCount.bed' '/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK96_15min_GG__cutsiteLoc-umiCount.bed' '/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK97_30min_GG__cutsiteLoc-umiCount.bed' '/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK98_on_GG__cutsiteLoc-umiCount.bed' -o '/media/Data/Quim/No_exclusion/1Mb/BICRO55/output' -s 1000000 -r BICRO55_1Mb_allMetrics -t 8 -g 10000

What did you expect to happen?

The command should complete successfully and produce the centrality estimates.

What happened instead?

The run crashed with a BEDTools sorting error (full traceback below).

Additional information

Traceback (most recent call last):
  File "/usr/local/lib/python3.5/dist-packages/joblib/_parallel_backends.py", line 350, in __call__
    return self.func(*args, **kwargs)
  File "/usr/local/lib/python3.5/dist-packages/joblib/parallel.py", line 131, in __call__
    return [func(*args, **kwargs) for func, args, kwargs in self.items]
  File "/usr/local/lib/python3.5/dist-packages/joblib/parallel.py", line 131, in <listcomp>
    return [func(*args, **kwargs) for func, args, kwargs in self.items]
  File "/usr/local/bin/gpseqc_estimate", line 399, in do_intersect
    bedfiles[i] = bed.to_combined_bins(groups, bedfiles[i])
  File "/usr/local/lib/python3.5/dist-packages/gpseqc/bed.py", line 245, in to_combined_bins
    isect = bins.intersect(bed, wao = True, sorted = True)
  File "/usr/local/lib/python3.5/dist-packages/pybedtools/bedtool.py", line 806, in decorated
    result = method(self, *args, **kwargs)
  File "/usr/local/lib/python3.5/dist-packages/pybedtools/bedtool.py", line 337, in wrapped
    decode_output=decode_output,
  File "/usr/local/lib/python3.5/dist-packages/pybedtools/helpers.py", line 356, in call_bedtools
    raise BEDToolsError(subprocess.list2cmdline(cmds), stderr)
pybedtools.helpers.BEDToolsError: 
Command was:

	bedtools intersect -wao -sorted -b /tmp/pybedtools.zwcndsjx.tmp -a /tmp/pybedtools.kk9snf52.tmp

Error message was:
ERROR: Sort order was unspecified, and file /tmp/pybedtools.kk9snf52.tmp is not sorted lexicographically.
       Please re-reun with the -g option for a genome file.
       See documentation for details.


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.5/multiprocessing/pool.py", line 119, in worker
    result = (True, func(*args, **kwds))
  File "/usr/local/lib/python3.5/dist-packages/joblib/_parallel_backends.py", line 359, in __call__
    raise TransportableException(text, e_type)
joblib.my_exceptions.TransportableException: TransportableException
___________________________________________________________________________
BEDToolsError                                      Wed May  2 12:46:35 2018
PID: 5153                                    Python 3.5.2: /usr/bin/python3
...........................................................................
/usr/local/lib/python3.5/dist-packages/joblib/parallel.py in __call__(self=<joblib.parallel.BatchedCalls object>)
    126     def __init__(self, iterator_slice):
    127         self.items = list(iterator_slice)
    128         self._size = len(self.items)
    129 
    130     def __call__(self):
--> 131         return [func(*args, **kwargs) for func, args, kwargs in self.items]
        self.items = [(<function do_intersect>, (0, [<BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK93_1min_GG__cutsiteLoc-umiCount.bed)>, <BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK94_5min_GG__cutsiteLoc-umiCount.bed)>, <BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK95_10min_GG__cutsiteLoc-umiCount.bed)>, <BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK96_15min_GG__cutsiteLoc-umiCount.bed)>, <BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK97_30min_GG__cutsiteLoc-umiCount.bed)>, <BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK98_on_GG__cutsiteLoc-umiCount.bed)>], <BedTool(/tmp/pybedtools.38yeue3t.tmp)>, 'bins.size1000000.step1000000.group10000.csm3', Namespace(T='/tmp', bedfile=['/media/Data/Quim/N...='BICRO55_1Mb_allMetrics.', suffix='', threads=8)), {})]
    132 
    133     def __len__(self):
    134         return self._size
    135 

...........................................................................
/usr/local/lib/python3.5/dist-packages/joblib/parallel.py in <listcomp>(.0=<list_iterator object>)
    126     def __init__(self, iterator_slice):
    127         self.items = list(iterator_slice)
    128         self._size = len(self.items)
    129 
    130     def __call__(self):
--> 131         return [func(*args, **kwargs) for func, args, kwargs in self.items]
        func = <function do_intersect>
        args = (0, [<BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK93_1min_GG__cutsiteLoc-umiCount.bed)>, <BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK94_5min_GG__cutsiteLoc-umiCount.bed)>, <BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK95_10min_GG__cutsiteLoc-umiCount.bed)>, <BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK96_15min_GG__cutsiteLoc-umiCount.bed)>, <BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK97_30min_GG__cutsiteLoc-umiCount.bed)>, <BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK98_on_GG__cutsiteLoc-umiCount.bed)>], <BedTool(/tmp/pybedtools.38yeue3t.tmp)>, 'bins.size1000000.step1000000.group10000.csm3', Namespace(T='/tmp', bedfile=['/media/Data/Quim/N...='BICRO55_1Mb_allMetrics.', suffix='', threads=8))
        kwargs = {}
    132 
    133     def __len__(self):
    134         return self._size
    135 

...........................................................................
/usr/local/bin/gpseqc_estimate in do_intersect(i=0, bedfiles=[<BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK93_1min_GG__cutsiteLoc-umiCount.bed)>, <BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK94_5min_GG__cutsiteLoc-umiCount.bed)>, <BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK95_10min_GG__cutsiteLoc-umiCount.bed)>, <BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK96_15min_GG__cutsiteLoc-umiCount.bed)>, <BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK97_30min_GG__cutsiteLoc-umiCount.bed)>, <BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK98_on_GG__cutsiteLoc-umiCount.bed)>], groups=<BedTool(/tmp/pybedtools.38yeue3t.tmp)>, descr='bins.size1000000.step1000000.group10000.csm3', args=Namespace(T='/tmp', bedfile=['/media/Data/Quim/N...='BICRO55_1Mb_allMetrics.', suffix='', threads=8))
    394 	groups = bed.mk_windows(chr_sizes, args.group_size, args.group_size)
    395 	if args.debug_mode: bed_saveas(groups, "groups.%s.bed" % descr, args)
    396 
    397 	# Intersect
    398 	def do_intersect(i, bedfiles, groups, descr, args):
--> 399 		bedfiles[i] = bed.to_combined_bins(groups, bedfiles[i])
        bedfiles = [<BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK93_1min_GG__cutsiteLoc-umiCount.bed)>, <BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK94_5min_GG__cutsiteLoc-umiCount.bed)>, <BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK95_10min_GG__cutsiteLoc-umiCount.bed)>, <BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK96_15min_GG__cutsiteLoc-umiCount.bed)>, <BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK97_30min_GG__cutsiteLoc-umiCount.bed)>, <BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK98_on_GG__cutsiteLoc-umiCount.bed)>]
        i = 0
        groups = <BedTool(/tmp/pybedtools.38yeue3t.tmp)>
    400 		if args.debug_mode: bed_saveas(bedfiles[i], "grouped.%s.%s.tsv" % (
    401 			descr, os.path.basename(args.bedfile[i])), args)
    402 		return(bedfiles[i])
    403 

...........................................................................
/usr/local/lib/python3.5/dist-packages/gpseqc/bed.py in to_combined_bins(bins=<BedTool(/tmp/pybedtools.kk9snf52.tmp)>, bed=<BedTool(/tmp/pybedtools.zwcndsjx.tmp)>, fcomb=<function to_combined_bins.<locals>.<lambda>>)
    240 
    241     # Default combination style: sum
    242     if type(None) == type(fcomb): fcomb = lambda x, y: x + y
    243 
    244     # Perform intersection
--> 245     isect = bins.intersect(bed, wao = True, sorted = True)
        isect = undefined
        bins.intersect = <bound method BedTool._log_to_history.<locals>.decorated of <BedTool(/tmp/pybedtools.kk9snf52.tmp)>>
        bed = <BedTool(/tmp/pybedtools.zwcndsjx.tmp)>
    246 
    247     d2 = {}
    248     bi = 1  # Region counter
    249 

...........................................................................
/usr/local/lib/python3.5/dist-packages/pybedtools/bedtool.py in decorated(self=<BedTool(/tmp/pybedtools.kk9snf52.tmp)>, *args=(<BedTool(/tmp/pybedtools.zwcndsjx.tmp)>,), **kwargs={'sorted': True, 'wao': True})
    801         """
    802         def decorated(self, *args, **kwargs):
    803 
    804             # this calls the actual method in the first place; *result* is
    805             # whatever you get back
--> 806             result = method(self, *args, **kwargs)
        result = undefined
        self = <BedTool(/tmp/pybedtools.kk9snf52.tmp)>
        args = (<BedTool(/tmp/pybedtools.zwcndsjx.tmp)>,)
        kwargs = {'sorted': True, 'wao': True}
    807 
    808             # add appropriate tags
    809             parent_tag = self._tag
    810             result_tag = result._tag

...........................................................................
/usr/local/lib/python3.5/dist-packages/pybedtools/bedtool.py in wrapped(self=<BedTool(/tmp/pybedtools.kk9snf52.tmp)>, *args=(<BedTool(/tmp/pybedtools.zwcndsjx.tmp)>,), **kwargs={'a': '/tmp/pybedtools.kk9snf52.tmp', 'b': <BedTool(/tmp/pybedtools.zwcndsjx.tmp)>, 'sorted': True, 'wao': True})
    332             decode_output = not result_is_bam
    333 
    334             # Do the actual call
    335             stream = call_bedtools(cmds, tmp, stdin=stdin,
    336                                    check_stderr=check_stderr,
--> 337                                    decode_output=decode_output,
        decode_output = True
    338                                    )
    339 
    340             if does_not_return_bedtool:
    341                 return does_not_return_bedtool(stream, **kwargs)

...........................................................................
/usr/local/lib/python3.5/dist-packages/pybedtools/helpers.py in call_bedtools(cmds=['bedtools', 'intersect', '-wao', '-sorted', '-b', '/tmp/pybedtools.zwcndsjx.tmp', '-a', '/tmp/pybedtools.kk9snf52.tmp'], tmpfn='/tmp/pybedtools.2141z5we.tmp', stdin=None, check_stderr=None, decode_output=True, encode_input=True)
    351             if isinstance(stderr, bytes):
    352                 stderr = stderr.decode('UTF_8')
    353             if len(stderr) > 20 and "WARNING" in stderr[:20]:
    354                 sys.stderr.write(stderr)
    355             else:
--> 356                 raise BEDToolsError(subprocess.list2cmdline(cmds), stderr)
        cmds = ['bedtools', 'intersect', '-wao', '-sorted', '-b', '/tmp/pybedtools.zwcndsjx.tmp', '-a', '/tmp/pybedtools.kk9snf52.tmp']
        stderr = 'ERROR: Sort order was unspecified, and file /tmp...nome file.\n       See documentation for details.\n'
    357 
    358 
    359 
    360     except (OSError, IOError) as err:

BEDToolsError: 
Command was:

	bedtools intersect -wao -sorted -b /tmp/pybedtools.zwcndsjx.tmp -a /tmp/pybedtools.kk9snf52.tmp

Error message was:
ERROR: Sort order was unspecified, and file /tmp/pybedtools.kk9snf52.tmp is not sorted lexicographically.
       Please re-reun with the -g option for a genome file.
       See documentation for details.

___________________________________________________________________________
"""

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.5/dist-packages/joblib/parallel.py", line 699, in retrieve
    self._output.extend(job.get(timeout=self.timeout))
  File "/usr/lib/python3.5/multiprocessing/pool.py", line 608, in get
    raise self._value
joblib.my_exceptions.TransportableException: TransportableException
[same sub-process traceback as above]

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/bin/gpseqc_estimate", line 410, in <module>
    for i in range(len(bedfiles)))
  File "/usr/local/lib/python3.5/dist-packages/joblib/parallel.py", line 789, in __call__
    self.retrieve()
  File "/usr/local/lib/python3.5/dist-packages/joblib/parallel.py", line 740, in retrieve
    raise exception
joblib.my_exceptions.JoblibBEDToolsError: JoblibBEDToolsError
___________________________________________________________________________
Multiprocessing exception:
...........................................................................
/usr/local/bin/gpseqc_estimate in <module>()
    405 		for i in tqdm(range(len(bedfiles))):
    406 			do_intersect(i, bedfiles, groups, descr, args)
    407 	else:
    408 		bedfiles =  Parallel(n_jobs = args.threads, verbose = 11)(
    409 			delayed(do_intersect)(i, bedfiles, groups, descr, args)
--> 410 			for i in range(len(bedfiles)))
    411 
    412 # (4) Normalize over last condition --------------------------------------------
    413 
    414 if args.normalize:

...........................................................................
/usr/local/lib/python3.5/dist-packages/joblib/parallel.py in __call__(self=Parallel(n_jobs=8), iterable=<generator object <genexpr>>)
    784             if pre_dispatch == "all" or n_jobs == 1:
    785                 # The iterable was consumed all at once by the above for loop.
    786                 # No need to wait for async callbacks to trigger to
    787                 # consumption.
    788                 self._iterating = False
--> 789             self.retrieve()
        self.retrieve = <bound method Parallel.retrieve of Parallel(n_jobs=8)>
    790             # Make sure that we get a last message telling us we are done
    791             elapsed_time = time.time() - self._start_time
    792             self._print('Done %3i out of %3i | elapsed: %s finished',
    793                         (len(self._output), len(self._output),

---------------------------------------------------------------------------
Sub-process traceback:
---------------------------------------------------------------------------
BEDToolsError                                      Wed May  2 12:46:35 2018
PID: 5153                                    Python 3.5.2: /usr/bin/python3
...........................................................................
/usr/local/lib/python3.5/dist-packages/joblib/parallel.py in __call__(self=<joblib.parallel.BatchedCalls object>)
    126     def __init__(self, iterator_slice):
    127         self.items = list(iterator_slice)
    128         self._size = len(self.items)
    129 
    130     def __call__(self):
--> 131         return [func(*args, **kwargs) for func, args, kwargs in self.items]
        self.items = [(<function do_intersect>, (0, [<BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK93_1min_GG__cutsiteLoc-umiCount.bed)>, <BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK94_5min_GG__cutsiteLoc-umiCount.bed)>, <BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK95_10min_GG__cutsiteLoc-umiCount.bed)>, <BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK96_15min_GG__cutsiteLoc-umiCount.bed)>, <BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK97_30min_GG__cutsiteLoc-umiCount.bed)>, <BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK98_on_GG__cutsiteLoc-umiCount.bed)>], <BedTool(/tmp/pybedtools.38yeue3t.tmp)>, 'bins.size1000000.step1000000.group10000.csm3', Namespace(T='/tmp', bedfile=['/media/Data/Quim/N...='BICRO55_1Mb_allMetrics.', suffix='', threads=8)), {})]
    132 
    133     def __len__(self):
    134         return self._size
    135 

...........................................................................
/usr/local/lib/python3.5/dist-packages/joblib/parallel.py in <listcomp>(.0=<list_iterator object>)
    126     def __init__(self, iterator_slice):
    127         self.items = list(iterator_slice)
    128         self._size = len(self.items)
    129 
    130     def __call__(self):
--> 131         return [func(*args, **kwargs) for func, args, kwargs in self.items]
        func = <function do_intersect>
        args = (0, [<BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK93_1min_GG__cutsiteLoc-umiCount.bed)>, <BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK94_5min_GG__cutsiteLoc-umiCount.bed)>, <BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK95_10min_GG__cutsiteLoc-umiCount.bed)>, <BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK96_15min_GG__cutsiteLoc-umiCount.bed)>, <BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK97_30min_GG__cutsiteLoc-umiCount.bed)>, <BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK98_on_GG__cutsiteLoc-umiCount.bed)>], <BedTool(/tmp/pybedtools.38yeue3t.tmp)>, 'bins.size1000000.step1000000.group10000.csm3', Namespace(T='/tmp', bedfile=['/media/Data/Quim/N...='BICRO55_1Mb_allMetrics.', suffix='', threads=8))
        kwargs = {}
    132 
    133     def __len__(self):
    134         return self._size
    135 

...........................................................................
/usr/local/bin/gpseqc_estimate in do_intersect(i=0, bedfiles=[<BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK93_1min_GG__cutsiteLoc-umiCount.bed)>, <BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK94_5min_GG__cutsiteLoc-umiCount.bed)>, <BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK95_10min_GG__cutsiteLoc-umiCount.bed)>, <BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK96_15min_GG__cutsiteLoc-umiCount.bed)>, <BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK97_30min_GG__cutsiteLoc-umiCount.bed)>, <BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK98_on_GG__cutsiteLoc-umiCount.bed)>], groups=<BedTool(/tmp/pybedtools.38yeue3t.tmp)>, descr='bins.size1000000.step1000000.group10000.csm3', args=Namespace(T='/tmp', bedfile=['/media/Data/Quim/N...='BICRO55_1Mb_allMetrics.', suffix='', threads=8))
    394 	groups = bed.mk_windows(chr_sizes, args.group_size, args.group_size)
    395 	if args.debug_mode: bed_saveas(groups, "groups.%s.bed" % descr, args)
    396 
    397 	# Intersect
    398 	def do_intersect(i, bedfiles, groups, descr, args):
--> 399 		bedfiles[i] = bed.to_combined_bins(groups, bedfiles[i])
        bedfiles = [<BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK93_1min_GG__cutsiteLoc-umiCount.bed)>, <BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK94_5min_GG__cutsiteLoc-umiCount.bed)>, <BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK95_10min_GG__cutsiteLoc-umiCount.bed)>, <BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK96_15min_GG__cutsiteLoc-umiCount.bed)>, <BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK97_30min_GG__cutsiteLoc-umiCount.bed)>, <BedTool(/media/Data/Quim/No_exclusion/1Mb/BICRO55/input/TK98_on_GG__cutsiteLoc-umiCount.bed)>]
        i = 0
        groups = <BedTool(/tmp/pybedtools.38yeue3t.tmp)>
    400 		if args.debug_mode: bed_saveas(bedfiles[i], "grouped.%s.%s.tsv" % (
    401 			descr, os.path.basename(args.bedfile[i])), args)
    402 		return(bedfiles[i])
    403 

...........................................................................
/usr/local/lib/python3.5/dist-packages/gpseqc/bed.py in to_combined_bins(bins=<BedTool(/tmp/pybedtools.kk9snf52.tmp)>, bed=<BedTool(/tmp/pybedtools.zwcndsjx.tmp)>, fcomb=<function to_combined_bins.<locals>.<lambda>>)
    240 
    241     # Default combination style: sum
    242     if type(None) == type(fcomb): fcomb = lambda x, y: x + y
    243 
    244     # Perform intersection
--> 245     isect = bins.intersect(bed, wao = True, sorted = True)
        isect = undefined
        bins.intersect = <bound method BedTool._log_to_history.<locals>.decorated of <BedTool(/tmp/pybedtools.kk9snf52.tmp)>>
        bed = <BedTool(/tmp/pybedtools.zwcndsjx.tmp)>
    246 
    247     d2 = {}
    248     bi = 1  # Region counter
    249 

...........................................................................
/usr/local/lib/python3.5/dist-packages/pybedtools/bedtool.py in decorated(self=<BedTool(/tmp/pybedtools.kk9snf52.tmp)>, *args=(<BedTool(/tmp/pybedtools.zwcndsjx.tmp)>,), **kwargs={'sorted': True, 'wao': True})
    801         """
    802         def decorated(self, *args, **kwargs):
    803 
    804             # this calls the actual method in the first place; *result* is
    805             # whatever you get back
--> 806             result = method(self, *args, **kwargs)
        result = undefined
        self = <BedTool(/tmp/pybedtools.kk9snf52.tmp)>
        args = (<BedTool(/tmp/pybedtools.zwcndsjx.tmp)>,)
        kwargs = {'sorted': True, 'wao': True}
    807 
    808             # add appropriate tags
    809             parent_tag = self._tag
    810             result_tag = result._tag

...........................................................................
/usr/local/lib/python3.5/dist-packages/pybedtools/bedtool.py in wrapped(self=<BedTool(/tmp/pybedtools.kk9snf52.tmp)>, *args=(<BedTool(/tmp/pybedtools.zwcndsjx.tmp)>,), **kwargs={'a': '/tmp/pybedtools.kk9snf52.tmp', 'b': <BedTool(/tmp/pybedtools.zwcndsjx.tmp)>, 'sorted': True, 'wao': True})
    332             decode_output = not result_is_bam
    333 
    334             # Do the actual call
    335             stream = call_bedtools(cmds, tmp, stdin=stdin,
    336                                    check_stderr=check_stderr,
--> 337                                    decode_output=decode_output,
        decode_output = True
    338                                    )
    339 
    340             if does_not_return_bedtool:
    341                 return does_not_return_bedtool(stream, **kwargs)

...........................................................................
/usr/local/lib/python3.5/dist-packages/pybedtools/helpers.py in call_bedtools(cmds=['bedtools', 'intersect', '-wao', '-sorted', '-b', '/tmp/pybedtools.zwcndsjx.tmp', '-a', '/tmp/pybedtools.kk9snf52.tmp'], tmpfn='/tmp/pybedtools.2141z5we.tmp', stdin=None, check_stderr=None, decode_output=True, encode_input=True)
    351             if isinstance(stderr, bytes):
    352                 stderr = stderr.decode('UTF_8')
    353             if len(stderr) > 20 and "WARNING" in stderr[:20]:
    354                 sys.stderr.write(stderr)
    355             else:
--> 356                 raise BEDToolsError(subprocess.list2cmdline(cmds), stderr)
        cmds = ['bedtools', 'intersect', '-wao', '-sorted', '-b', '/tmp/pybedtools.zwcndsjx.tmp', '-a', '/tmp/pybedtools.kk9snf52.tmp']
        stderr = 'ERROR: Sort order was unspecified, and file /tmp...nome file.\n       See documentation for details.\n'
    357 
    358 
    359 
    360     except (OSError, IOError) as err:

BEDToolsError: 
Command was:

	bedtools intersect -wao -sorted -b /tmp/pybedtools.zwcndsjx.tmp -a /tmp/pybedtools.kk9snf52.tmp

Error message was:
ERROR: Sort order was unspecified, and file /tmp/pybedtools.kk9snf52.tmp is not sorted lexicographically.
       Please re-reun with the -g option for a genome file.
       See documentation for details.
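The bedtools error above means the file handed to `intersect` with `-sorted` was not in lexicographic order (a workaround is to pre-sort the inputs with `sort -k1,1 -k2,2n`, or to call `.sort()` on the pybedtools `BedTool` before intersecting). A minimal sketch of the ordering bedtools expects, using a hypothetical helper not part of gpseqc:

```python
def sort_bed_lines(lines):
    # bedtools intersect -sorted expects the same order as
    # `sort -k1,1 -k2,2n`: chromosome name lexicographically,
    # then start position numerically
    def key(line):
        fields = line.split("\t")
        return (fields[0], int(fields[1]))
    return sorted(lines, key=key)

bed = ["chr2\t100\t200", "chr10\t500\t600", "chr2\t50\t80"]
# lexicographically, "chr10" sorts before "chr2"
print(sort_bed_lines(bed))
```

Note that in lexicographic order `chr10` precedes `chr2`, which is exactly the ordering that trips up files sorted in "natural" chromosome order.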

Rescaling not working

Before submitting an issue, please be sure to

This issue affects

  • Centrality estimation (gpseqc_estimate)
  • Ranks comparison (gpseqc_compare)

What did you do (e.g., steps to reproduce)

Ran gpseqc_estimate as usual.

What did you expect to happen?

Rescaled output should be computed ignoring outliers (IQR method with a 1.5 limit, the default).

What happened instead?

All values were used for the rescaling.

Additional information

Got this error during the rescaling.

/home/gire/.local/lib/python3.6/site-packages/numpy/lib/function_base.py:3652: RuntimeWarning: Invalid value encountered in percentile
  interpolation=interpolation)
/home/gire/ownCloud/BiCro/Code/repos/gpseqc/gpseqc/stats.py:87: RuntimeWarning: invalid value encountered in greater_equal
  return np.absolute(IQR_values) >= lim
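The RuntimeWarning above comes from comparing NaN values inside the IQR test in `stats.py`. A hedged sketch of an IQR outlier mask that skips NaNs so no warning is raised (illustrative only, not the actual gpseqc implementation):

```python
import numpy as np

def iqr_outliers(x, lim=1.5):
    # flag values beyond lim * IQR from the quartiles,
    # evaluating the comparison only on non-NaN entries
    x = np.asarray(x, dtype=float)
    q1, q3 = np.nanpercentile(x, [25, 75])
    iqr = q3 - q1
    mask = np.zeros(x.shape, dtype=bool)
    finite = ~np.isnan(x)
    mask[finite] = (x[finite] < q1 - lim * iqr) | (x[finite] > q3 + lim * iqr)
    return mask

data = np.array([1.0, 2.0, 2.5, 3.0, 100.0, np.nan])
print(iqr_outliers(data))  # only 100.0 is flagged; NaN stays False
```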

EMD failed after bootstrap

Before submitting an issue, please be sure to

This issue affects

  • Centrality estimation (gpseqc_estimate)
  • Ranks comparison (gpseqc_compare)

What did you do (e.g., steps to reproduce)

/usr/local/bin/gpseqc_compare /home/bicro/Desktop/Backup/user_folders_HD2/Tomek/BICRO60/Data_output/bed_files/new/10Mb/estimated.bins.size10000000.step10000000.group10000.csm3.tsv /home/bicro/Desktop/Backup/user_folders_HD2/Tomek/BICRO62/Data_output/bed_files/new/10Mb/all conditions/estimated.bins.size10000000.step10000000.group10000.csm3.tsv -o /home/bicro/Desktop/Backup/user_folders_HD2/Tomek/BICRO62/Data_output/bed_files/new/10Mb/all conditions/rankcmp -t 37 -p B62_vs_B60_10Mb_all_conditions -d emd
@2018-05-18 15:20:06.717597

Python 3.6.5 (default, Apr  1 2018, 05:46:30) 
[GCC 7.3.0]

 # GPSeqC - Ranking comparison

 1st rank : /home/bicro/Desktop/Backup/user_folders_HD2/Tomek/BICRO60/Data_output/bed_files/new/10Mb/estimated.bins.size10000000.step10000000.group10000.csm3.tsv
 2nd rank : /home/bicro/Desktop/Backup/user_folders_HD2/Tomek/BICRO62/Data_output/bed_files/new/10Mb/all conditions/estimated.bins.size10000000.step10000000.group10000.csm3.tsv

 Distance : emd
    nIter : 5000
  Threads : 37

    Delim : '   '
   Prefix : 'B62_vs_B60_10Mb_all_conditions.'

What did you expect to happen?

What happened instead?

[Parallel(n_jobs=37)]: Done 4926 tasks      | elapsed: 330.8min
[Parallel(n_jobs=37)]: Done 4927 tasks      | elapsed: 330.9min
[Parallel(n_jobs=37)]: Done 5000 out of 5000 | elapsed: 334.7min finished
> Calculating p-value(s)...
  0%|                                                                                        | 0/225 [00:00<?, ?it/s]/usr/local/lib/python3.6/dist-packages/scipy/stats/stats.py:1800: RuntimeWarning: invalid value encountered in less
  return (np.sum(a < score) + np.sum(a <= score)) * 50 / float(n)
/usr/local/lib/python3.6/dist-packages/scipy/stats/stats.py:1800: RuntimeWarning: invalid value encountered in less_equal
  return (np.sum(a < score) + np.sum(a <= score)) * 50 / float(n)
100%|████████████████████████████████████████████████████████████████████████████| 225/225 [00:13<00:00, 16.27it/s]
Exporting tsv tables...
Plotting...
  0%|                                                                                        | 0/225 [00:00<?, ?it/s]/usr/local/lib/python3.6/dist-packages/numpy/core/_methods.py:29: RuntimeWarning: invalid value encountered in reduce
  return umr_minimum(a, axis, None, out, keepdims)
/usr/local/lib/python3.6/dist-packages/numpy/core/_methods.py:26: RuntimeWarning: invalid value encountered in reduce
  return umr_maximum(a, axis, None, out, keepdims)
Traceback (most recent call last):
  File "/usr/local/bin/gpseqc_compare", line 6, in <module>
    exec(compile(open(__file__).read(), __file__, 'exec'))
  File "/home/bicro/Desktop/ggcode/gpseqc/bin/gpseqc_compare", line 265, in <module>
    title, dlabel, xlim).savefig(outpdf, format = 'pdf')
  File "/home/bicro/Desktop/ggcode/gpseqc/gpseqc/compare.py", line 782, in plot_comparison
    ax.hist(rand_distr, 40, density = True, color = '#fddbc7')
  File "/usr/local/lib/python3.6/dist-packages/matplotlib/__init__.py", line 1855, in inner
    return func(ax, *args, **kwargs)
  File "/usr/local/lib/python3.6/dist-packages/matplotlib/axes/_axes.py", line 6530, in hist
    m, bins = np.histogram(x[i], bins, weights=w[i], **hist_kwargs)
  File "/usr/local/lib/python3.6/dist-packages/numpy/lib/function_base.py", line 667, in histogram
    'max must be larger than min in range parameter.')
ValueError: max must be larger than min in range parameter.
Exception ignored in: <bound method tqdm.__del__ of   0%|                                                                                        | 0/225 [00:00<?, ?it/s]>
Traceback (most recent call last):
  File "/usr/local/lib/python3.6/dist-packages/tqdm/_tqdm.py", line 878, in __del__
  File "/usr/local/lib/python3.6/dist-packages/tqdm/_tqdm.py", line 1097, in close
  File "/usr/local/lib/python3.6/dist-packages/tqdm/_tqdm.py", line 438, in _decr_instances
  File "/usr/lib/python3.6/_weakrefset.py", line 109, in remove
KeyError: <weakref at 0x7f260f2895e8; to 'tqdm' at 0x7f2590290c88>
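The `ValueError` above is what `np.histogram` raises when the values reaching `ax.hist` contain NaN (here, the bootstrapped distances), so min and max of the range are both NaN. A minimal guard, assuming the goal is simply to drop non-finite values before plotting:

```python
import numpy as np

def finite_only(values):
    # keep only finite entries; NaN/inf in the input would otherwise make
    # np.histogram fail with "max must be larger than min in range parameter"
    values = np.asarray(values, dtype=float)
    return values[np.isfinite(values)]

rand_distr = np.array([0.1, np.nan, 0.4, np.inf, 0.2])
counts, edges = np.histogram(finite_only(rand_distr), bins=4)
print(int(counts.sum()))  # 3 finite values binned
```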

Additional information

IsADirectoryError: [Errno 21] Is a directory: '/'

Before submitting an issue, please be sure to

This issue affects

  • Centrality estimation (gpseqc_estimate)
  • Ranks comparison (gpseqc_compare)

What did you do (e.g., steps to reproduce)

gpseqc_estimate '/home/bicro/Desktop/B72_LAMB_ranking/bed/TK122_1min_GG__cutsiteLoc-umiCount.bed' '/home/bicro/Desktop/B72_LAMB_ranking/bed/TK123_5min_GG__cutsiteLoc-umiCount.bed' '/home/bicro/Desktop/B72_LAMB_ranking/bed/TK124_10min_GG__cutsiteLoc-umiCount.bed' '/home/bicro/Desktop/B72_LAMB_ranking/bed/TK125_15min_GG__cutsiteLoc-umiCount.bed' '/home/bicro/Desktop/B72_LAMB_ranking/bed/TK126_30min_GG__cutsiteLoc-umiCount.bed' '/home/bicro/Desktop/B72_LAMB_ranking/bed/TK127_ON_GG__cutsiteLoc-umiCount.bed' -o '/home/bicro/Desktop/B72_LAMB_ranking/single_probe_1Mb' -g 10000 -b '/media/bs2-pro/GPSeq/probe_bed/Old/single_probe.1Mb.bed' -r BICRO72_1Mb_allMetrics_single_probe -t 8 -s 1000000

What did you expect to happen?

The command should run to completion.

What happened instead?

Identifying chromosomes...
100%|█████████████████████████████████████████████| 6/6 [00:03<00:00, 1.63it/s]
Generating bins...
Traceback (most recent call last):
  File "/home/bicro/.pyenv/versions/general/bin/gpseqc_estimate", line 389, in <module>
    bins = pbt.BedTool(args.bin_bed[0])
  File "/home/bicro/.pyenv/versions/3.6.5/envs/general/lib/python3.6/site-packages/pybedtools/bedtool.py", line 452, in __init__
    self._isbam = isBAM(fn)
  File "/home/bicro/.pyenv/versions/3.6.5/envs/general/lib/python3.6/site-packages/pybedtools/helpers.py", line 128, in isBAM
    if isBGZIP(fn) and (gzip.open(fn, 'rb').read(4).decode() == 'BAM\x01'):
  File "/home/bicro/.pyenv/versions/3.6.5/envs/general/lib/python3.6/site-packages/pybedtools/helpers.py", line 107, in isBGZIP
    header_str = open(fn, 'rb').read(15)
IsADirectoryError: [Errno 21] Is a directory: '/'
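The traceback shows pybedtools calling `open()` on whatever string reaches `args.bin_bed[0]`; when that string resolves to a directory (here `/`), the raw `IsADirectoryError` surfaces. A defensive check one could run before constructing the `BedTool` (hypothetical helper, not part of gpseqc):

```python
import os

def require_bed_file(path):
    # pybedtools opens the path directly, so a directory such as '/'
    # produces a raw IsADirectoryError; fail early with a clearer message
    if not os.path.isfile(path):
        raise ValueError("expected a BED file, got: %r" % path)
    return path
```

With this guard, `require_bed_file("/")` raises a `ValueError` with an explicit message instead of the opaque low-level error.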

Additional information

Problem running the microscopy pipeline

Before submitting an issue, please be sure to

This issue affects

  • Centrality estimation (gpseqc_estimate)
  • Ranks comparison (gpseqc_compare)

What did you do (e.g., steps to reproduce)

bicro@orion:~/Desktop/user_folders/Tomek/microscopy_pipeline/IMR90_iTK225-240_080618$ gpseq_anim -d dapi -s cy5 -t 36 --note "TK GPSeq YFISH asynch IMR90 deconvolved, center as percentile" --center-as-percentile -r '/home/bicro/Desktop/user_folders/Tomek/microscopy_pipeline/IMR90_iTK226-241_080618/Data_input_iTK226_233' '/home/bicro/Desktop/user_folders/Tomek/microscopy_pipeline/IMR90_iTK226-241_080618/Data_output_iTK226_233'  --sigma-smooth 0.05 -n
 


---------- SETTING:  VALUE ----------

   Input directory :  /home/bicro/Desktop/user_folders/Tomek/microscopy_pipeline/IMR90_iTK226-241_080618/Data_input_iTK226_233
  Output directory :  /home/bicro/Desktop/user_folders/Tomek/microscopy_pipeline/IMR90_iTK226-241_080618/Data_output_iTK226_233
          Log file :  /home/bicro/Desktop/user_folders/Tomek/microscopy_pipeline/IMR90_iTK226-241_080618/Data_output_iTK226_233/2018-06-08_22:17:31_pyGPSeq.log
  
     Skipped steps :  None
  
      DNA channels :  ('dapi',)
   Signal channels :  ('cy5',)

      Segmentation :  3d
          Analysis :  mid
    Middle section :  largest
       Center perc :  True

Voxel aspect (ZYX) :  (300.0, 216.6, 216.6)
       Aspect unit :  nm
 Minimum Z portion :  0.25
    Minimum radius :  10.00 vx
        Fill holes :  True

    Sigma (smooth) :  0.0500
   Sigma (density) :  0.1000
             #bins :  200

  Condition descr. :  *NONE*

 Nuclear selection :  flat_size sumI

           Threads :  36
              Note :  TK GPSeq YFISH asynch IMR90 deconvolved, center as percentile

            Regexp :  '^(?P<channel_name>[^/]*)\.(?P<channel_id>channel[0-9]+)\.(?P<series_id>series[0-9]+)(?P<ext>(_cmle)?\.tif)$'

   Rescale deconv. :  True
   Normalize dist. :  True
         Debug mod :  False


Traceback (most recent call last):
  File "/usr/local/bin/gpseq_anim", line 6, in <module>
    exec(compile(open(__file__).read(), __file__, 'exec'))
  File "/home/bicro/Desktop/ggcode/pygpseq/bin/gpseq_anim", line 381, in <module>
    OH.write("@%s\n\n" % datetime.datetime.now())
NameError: name 'datetime' is not defined
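The `NameError` above means `gpseq_anim` calls `datetime.datetime.now()` without importing the module; the fix is a single import at the top of the script:

```python
import datetime

# with the import in place, the failing line from gpseq_anim works
stamp = "@%s\n\n" % datetime.datetime.now()
print(stamp.startswith("@"))
```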

What did you expect to happen?

To run the GPSeq microscopy pipeline.

What happened instead?

It didn't start running.

Additional information

Missing conditions in combined table.

Before submitting an issue, please be sure to

This issue affects

  • Centrality estimation (gpseqc_estimate)
  • Ranks comparison (gpseqc_compare)

What did you do (e.g., steps to reproduce)

gpseqc_estimate '/home/bicro/Desktop/user_folders_HD2/Quim/1min_exclusion/1Mb/BICRO55/input/TK94_5min_GG__cutsiteLoc-umiCount.bed' '/home/bicro/Desktop/user_folders_HD2/Quim/1min_exclusion/1Mb/BICRO55/input/TK95_10min_GG__cutsiteLoc-umiCount.bed' '/home/bicro/Desktop/user_folders_HD2/Quim/1min_exclusion/1Mb/BICRO55/input/TK96_15min_GG__cutsiteLoc-umiCount.bed' '/home/bicro/Desktop/user_folders_HD2/Quim/1min_exclusion/1Mb/BICRO55/input/TK97_30min_GG__cutsiteLoc-umiCount.bed' '/home/bicro/Desktop/user_folders_HD2/Quim/1min_exclusion/1Mb/BICRO55/input/TK98_on_GG__cutsiteLoc-umiCount.bed' -o '/home/bicro/Desktop/user_folders_HD2/Quim/1min_exclusion/1Mb/BICRO55/output' -s 1000000 -r BICRO55_excluding_1min_1MB_allMetrics -t 7

What did you expect to happen?

What happened instead?

Estimating centrality...
[Parallel(n_jobs=7)]: Batch computation too fast (0.0495s.) Setting batch_size=8.
[Parallel(n_jobs=7)]: Done   4 tasks      | elapsed:    0.1s
[Parallel(n_jobs=7)]: Done  11 tasks      | elapsed:    0.1s
[Parallel(n_jobs=7)]: Done  46 tasks      | elapsed:    0.5s
[Parallel(n_jobs=7)]: Done 118 tasks      | elapsed:    0.9s
[Parallel(n_jobs=7)]: Done 190 tasks      | elapsed:    1.3s
[Parallel(n_jobs=7)]: Done 278 tasks      | elapsed:    1.7s
[Parallel(n_jobs=7)]: Done 366 tasks      | elapsed:    2.2s
[Parallel(n_jobs=7)]: Done 470 tasks      | elapsed:    2.9s
[Parallel(n_jobs=7)]: Done 574 tasks      | elapsed:    3.6s
[Parallel(n_jobs=7)]: Done 694 tasks      | elapsed:    4.2s
[Parallel(n_jobs=7)]: Done 814 tasks      | elapsed:    5.0s
[Parallel(n_jobs=7)]: Done 950 tasks      | elapsed:    5.7s
[Parallel(n_jobs=7)]: Done 1086 tasks      | elapsed:    6.5s
[Parallel(n_jobs=7)]: Done 1238 tasks      | elapsed:    7.4s
[Parallel(n_jobs=7)]: Done 1390 tasks      | elapsed:    8.2s
[Parallel(n_jobs=7)]: Done 1558 tasks      | elapsed:    9.1s
[Parallel(n_jobs=7)]: Done 1726 tasks      | elapsed:   10.2s
[Parallel(n_jobs=7)]: Done 1910 tasks      | elapsed:   11.1s
[Parallel(n_jobs=7)]: Done 2094 tasks      | elapsed:   12.2s
[Parallel(n_jobs=7)]: Done 2294 tasks      | elapsed:   13.3s
[Parallel(n_jobs=7)]: Done 2494 tasks      | elapsed:   14.5s
[Parallel(n_jobs=7)]: Done 2710 tasks      | elapsed:   15.6s
multiprocessing.pool.RemoteTraceback: 
"""
Traceback (most recent call last):
  File "/usr/local/lib/python3.5/dist-packages/joblib/_parallel_backends.py", line 350, in __call__
    return self.func(*args, **kwargs)
  File "/usr/local/lib/python3.5/dist-packages/joblib/parallel.py", line 131, in __call__
    return [func(*args, **kwargs) for func, args, kwargs in self.items]
  File "/usr/local/lib/python3.5/dist-packages/joblib/parallel.py", line 131, in <listcomp>
    return [func(*args, **kwargs) for func, args, kwargs in self.items]
  File "/usr/local/lib/python3.5/dist-packages/gpseqc/centrality.py", line 243, in bin_estimate_single
    orow[m] = est_2p(st, calc_p, lambda x, y: x / y)
  File "/usr/local/lib/python3.5/dist-packages/gpseqc/centrality.py", line 123, in est_2p
    a = f1(st, 0)
  File "/usr/local/lib/python3.5/dist-packages/gpseqc/centrality.py", line 31, in calc_p
    row = st.iloc[ci, :]
  File "/usr/local/lib/python3.5/dist-packages/pandas/core/indexing.py", line 1367, in __getitem__
    return self._getitem_tuple(key)
  File "/usr/local/lib/python3.5/dist-packages/pandas/core/indexing.py", line 1737, in _getitem_tuple
    self._has_valid_tuple(tup)
  File "/usr/local/lib/python3.5/dist-packages/pandas/core/indexing.py", line 203, in _has_valid_tuple
    raise IndexingError('Too many indexers')
pandas.core.indexing.IndexingError: Too many indexers

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.5/multiprocessing/pool.py", line 119, in worker
    result = (True, func(*args, **kwds))
  File "/usr/local/lib/python3.5/dist-packages/joblib/_parallel_backends.py", line 359, in __call__
    raise TransportableException(text, e_type)
joblib.my_exceptions.TransportableException: TransportableException
___________________________________________________________________________
IndexingError                                      Tue Apr 24 13:48:52 2018
PID: 29536                                   Python 3.5.2: /usr/bin/python3
...........................................................................
/usr/local/lib/python3.5/dist-packages/joblib/parallel.py in __call__(self=<joblib.parallel.BatchedCalls object>)
    126     def __init__(self, iterator_slice):
    127         self.items = list(iterator_slice)
    128         self._size = len(self.items)
    129 
    130     def __call__(self):
--> 131         return [func(*args, **kwargs) for func, args, kwargs in self.items]
        self.items = [(<function bin_estimate_single>, (2870,      chrom      start        end      sum       ...2        735658     4  

[11504 rows x 9 columns], ['prob_2p', 'prob_f', 'prob_g', 'cor_2p', 'cor_f', 'cor_g', 'roc_2p', 'roc_f', 'roc_g', 'var_2p', 'var_f', 'ff_2p', 'ff_f', 'cv_2p', 'cv_f']), {}), (<function bin_estimate_single>, (2871,      chrom      start        end      sum       ...2        735658     4  

[11504 rows x 9 columns], ['prob_2p', 'prob_f', 'prob_g', 'cor_2p', 'cor_f', 'cor_g', 'roc_2p', 'roc_f', 'roc_g', 'var_2p', 'var_f', 'ff_2p', 'ff_f', 'cv_2p', 'cv_f']), {}), (<function bin_estimate_single>, (2872,      chrom      start        end      sum       ...2        735658     4  

[11504 rows x 9 columns], ['prob_2p', 'prob_f', 'prob_g', 'cor_2p', 'cor_f', 'cor_g', 'roc_2p', 'roc_f', 'roc_g', 'var_2p', 'var_f', 'ff_2p', 'ff_f', 'cv_2p', 'cv_f']), {}), (<function bin_estimate_single>, (2873,      chrom      start        end      sum       ...2        735658     4  

[11504 rows x 9 columns], ['prob_2p', 'prob_f', 'prob_g', 'cor_2p', 'cor_f', 'cor_g', 'roc_2p', 'roc_f', 'roc_g', 'var_2p', 'var_f', 'ff_2p', 'ff_f', 'cv_2p', 'cv_f']), {}), (<function bin_estimate_single>, (2874,      chrom      start        end      sum       ...2        735658     4  

[11504 rows x 9 columns], ['prob_2p', 'prob_f', 'prob_g', 'cor_2p', 'cor_f', 'cor_g', 'roc_2p', 'roc_f', 'roc_g', 'var_2p', 'var_f', 'ff_2p', 'ff_f', 'cv_2p', 'cv_f']), {}), (<function bin_estimate_single>, (2875,      chrom      start        end      sum       ...2        735658     4  

[11504 rows x 9 columns], ['prob_2p', 'prob_f', 'prob_g', 'cor_2p', 'cor_f', 'cor_g', 'roc_2p', 'roc_f', 'roc_g', 'var_2p', 'var_f', 'ff_2p', 'ff_f', 'cv_2p', 'cv_f']), {}), (<function bin_estimate_single>, (2876,      chrom      start        end      sum       ...2        735658     4  

[11504 rows x 9 columns], ['prob_2p', 'prob_f', 'prob_g', 'cor_2p', 'cor_f', 'cor_g', 'roc_2p', 'roc_f', 'roc_g', 'var_2p', 'var_f', 'ff_2p', 'ff_f', 'cv_2p', 'cv_f']), {}), (<function bin_estimate_single>, (2877,      chrom      start        end      sum       ...2        735658     4  

[11504 rows x 9 columns], ['prob_2p', 'prob_f', 'prob_g', 'cor_2p', 'cor_f', 'cor_g', 'roc_2p', 'roc_f', 'roc_g', 'var_2p', 'var_f', 'ff_2p', 'ff_f', 'cv_2p', 'cv_f']), {})]
    132 
    133     def __len__(self):
    134         return self._size
    135 

...........................................................................
/usr/local/lib/python3.5/dist-packages/joblib/parallel.py in <listcomp>(.0=<list_iterator object>)
    126     def __init__(self, iterator_slice):
    127         self.items = list(iterator_slice)
    128         self._size = len(self.items)
    129 
    130     def __call__(self):
--> 131         return [func(*args, **kwargs) for func, args, kwargs in self.items]
        func = <function bin_estimate_single>
        args = (2877,      chrom      start        end      sum       ...2        735658     4  

[11504 rows x 9 columns], ['prob_2p', 'prob_f', 'prob_g', 'cor_2p', 'cor_f', 'cor_g', 'roc_2p', 'roc_f', 'roc_g', 'var_2p', 'var_f', 'ff_2p', 'ff_f', 'cv_2p', 'cv_f'])
        kwargs = {}
    132 
    133     def __len__(self):
    134         return self._size
    135 

...........................................................................
/usr/local/lib/python3.5/dist-packages/gpseqc/centrality.py in bin_estimate_single(i=2877, df=     chrom      start        end      sum       ...2        735658     4  

[11504 rows x 9 columns], mlist=['prob_2p', 'prob_f', 'prob_g', 'cor_2p', 'cor_f', 'cor_g', 'roc_2p', 'roc_f', 'roc_g', 'var_2p', 'var_f', 'ff_2p', 'ff_f', 'cv_2p', 'cv_f'])
    238 
    239     # Calculate requested metrics
    240     for m in mlist:
    241         # Probability
    242         if m == "prob_2p": # two-points
--> 243             orow[m] = est_2p(st, calc_p, lambda x, y: x / y)
        orow = 'chr6'
        m = 'prob_2p'
        st = 0        chr6
1    87000000
2    88000000
3     ...   596776
8           3
Name: 2877, dtype: object
    244         elif m == "prob_f": # fixed
    245             orow[m] = est_f(st, calc_p, lambda x, y: x / y)
    246         elif m == "prob_g": # global
    247             orow[m] = est_g(st, calc_p, lambda x, y: x / y)

...........................................................................
/usr/local/lib/python3.5/dist-packages/gpseqc/centrality.py in est_2p(st=0        chr6
1    87000000
2    88000000
3     ...   596776
8           3
Name: 2877, dtype: object, f1=<function calc_p>, f2=<function bin_estimate_single.<locals>.<lambda>>)
    118         f2 (fun): function for putting conditions together.
    119 
    120     Returns:
    121         Estimated centrality.
    122     '''
--> 123     a = f1(st, 0)
        a = undefined
        f1 = <function calc_p>
        st = 0        chr6
1    87000000
2    88000000
3     ...   596776
8           3
Name: 2877, dtype: object
    124     b = f1(st, st.shape[0] - 1)
    125     return(f2(b, a))
    126 
    127 def est_f(st, f1, f2):

...........................................................................
/usr/local/lib/python3.5/dist-packages/gpseqc/centrality.py in calc_p(st=0        chr6
1    87000000
2    88000000
3     ...   596776
8           3
Name: 2877, dtype: object, ci=0)
     26 
     27     Returns:
     28         float
     29     '''
     30     assert ci < st.shape[0], "requested condition (index) not found."
---> 31     row = st.iloc[ci, :]
        row = undefined
        st.iloc = <pandas.core.indexing._iLocIndexer object>
        ci = 0
     32     p = (row['cond_nreads'] * row['count'])
     33     p = row['sum'] / p if 0 != p else np.nan
     34     return(p)
     35 

...........................................................................
/usr/local/lib/python3.5/dist-packages/pandas/core/indexing.py in __getitem__(self=<pandas.core.indexing._iLocIndexer object>, key=(0, slice(None, None, None)))
   1362             try:
   1363                 if self._is_scalar_access(key):
   1364                     return self._getitem_scalar(key)
   1365             except (KeyError, IndexError):
   1366                 pass
-> 1367             return self._getitem_tuple(key)
        self._getitem_tuple = <bound method _iLocIndexer._getitem_tuple of <pandas.core.indexing._iLocIndexer object>>
        key = (0, slice(None, None, None))
   1368         else:
   1369             # we by definition only have the 0th axis
   1370             axis = self.axis or 0
   1371 

...........................................................................
/usr/local/lib/python3.5/dist-packages/pandas/core/indexing.py in _getitem_tuple(self=<pandas.core.indexing._iLocIndexer object>, tup=(0, slice(None, None, None)))
   1732 
   1733         return True
   1734 
   1735     def _getitem_tuple(self, tup):
   1736 
-> 1737         self._has_valid_tuple(tup)
        self._has_valid_tuple = <bound method _NDFrameIndexer._has_valid_tuple of <pandas.core.indexing._iLocIndexer object>>
        tup = (0, slice(None, None, None))
   1738         try:
   1739             return self._getitem_lowerdim(tup)
   1740         except:
   1741             pass

...........................................................................
/usr/local/lib/python3.5/dist-packages/pandas/core/indexing.py in _has_valid_tuple(self=<pandas.core.indexing._iLocIndexer object>, key=(0, slice(None, None, None)))
    198 
    199     def _has_valid_tuple(self, key):
    200         """ check the key for valid keys across my indexer """
    201         for i, k in enumerate(key):
    202             if i >= self.obj.ndim:
--> 203                 raise IndexingError('Too many indexers')
    204             if not self._has_valid_type(k, i):
    205                 raise ValueError("Location based indexing can only have "
    206                                  "[{types}] types"
    207                                  .format(types=self._valid_types))

IndexingError: Too many indexers
___________________________________________________________________________
"""

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.5/dist-packages/joblib/parallel.py", line 699, in retrieve
    self._output.extend(job.get(timeout=self.timeout))
  File "/usr/lib/python3.5/multiprocessing/pool.py", line 608, in get
    raise self._value
joblib.my_exceptions.TransportableException: TransportableException
___________________________________________________________________________
IndexingError                                      Tue Apr 24 13:48:52 2018
PID: 29536                                   Python 3.5.2: /usr/bin/python3
...........................................................................
/usr/local/lib/python3.5/dist-packages/joblib/parallel.py in __call__(self=<joblib.parallel.BatchedCalls object>)
    126     def __init__(self, iterator_slice):
    127         self.items = list(iterator_slice)
    128         self._size = len(self.items)
    129 
    130     def __call__(self):
--> 131         return [func(*args, **kwargs) for func, args, kwargs in self.items]
        self.items = [(<function bin_estimate_single>, (2870,      chrom      start        end      sum       ...2        735658     4  

[11504 rows x 9 columns], ['prob_2p', 'prob_f', 'prob_g', 'cor_2p', 'cor_f', 'cor_g', 'roc_2p', 'roc_f', 'roc_g', 'var_2p', 'var_f', 'ff_2p', 'ff_f', 'cv_2p', 'cv_f']), {}), (<function bin_estimate_single>, (2871,      chrom      start        end      sum       ...2        735658     4  

[11504 rows x 9 columns], ['prob_2p', 'prob_f', 'prob_g', 'cor_2p', 'cor_f', 'cor_g', 'roc_2p', 'roc_f', 'roc_g', 'var_2p', 'var_f', 'ff_2p', 'ff_f', 'cv_2p', 'cv_f']), {}), (<function bin_estimate_single>, (2872,      chrom      start        end      sum       ...2        735658     4  

[11504 rows x 9 columns], ['prob_2p', 'prob_f', 'prob_g', 'cor_2p', 'cor_f', 'cor_g', 'roc_2p', 'roc_f', 'roc_g', 'var_2p', 'var_f', 'ff_2p', 'ff_f', 'cv_2p', 'cv_f']), {}), (<function bin_estimate_single>, (2873,      chrom      start        end      sum       ...2        735658     4  

[11504 rows x 9 columns], ['prob_2p', 'prob_f', 'prob_g', 'cor_2p', 'cor_f', 'cor_g', 'roc_2p', 'roc_f', 'roc_g', 'var_2p', 'var_f', 'ff_2p', 'ff_f', 'cv_2p', 'cv_f']), {}), (<function bin_estimate_single>, (2874,      chrom      start        end      sum       ...2        735658     4  

[11504 rows x 9 columns], ['prob_2p', 'prob_f', 'prob_g', 'cor_2p', 'cor_f', 'cor_g', 'roc_2p', 'roc_f', 'roc_g', 'var_2p', 'var_f', 'ff_2p', 'ff_f', 'cv_2p', 'cv_f']), {}), (<function bin_estimate_single>, (2875,      chrom      start        end      sum       ...2        735658     4  

[11504 rows x 9 columns], ['prob_2p', 'prob_f', 'prob_g', 'cor_2p', 'cor_f', 'cor_g', 'roc_2p', 'roc_f', 'roc_g', 'var_2p', 'var_f', 'ff_2p', 'ff_f', 'cv_2p', 'cv_f']), {}), (<function bin_estimate_single>, (2876,      chrom      start        end      sum       ...2        735658     4  

[11504 rows x 9 columns], ['prob_2p', 'prob_f', 'prob_g', 'cor_2p', 'cor_f', 'cor_g', 'roc_2p', 'roc_f', 'roc_g', 'var_2p', 'var_f', 'ff_2p', 'ff_f', 'cv_2p', 'cv_f']), {}), (<function bin_estimate_single>, (2877,      chrom      start        end      sum       ...2        735658     4  

[11504 rows x 9 columns], ['prob_2p', 'prob_f', 'prob_g', 'cor_2p', 'cor_f', 'cor_g', 'roc_2p', 'roc_f', 'roc_g', 'var_2p', 'var_f', 'ff_2p', 'ff_f', 'cv_2p', 'cv_f']), {})]
    132 
    133     def __len__(self):
    134         return self._size
    135 

...........................................................................
/usr/local/lib/python3.5/dist-packages/joblib/parallel.py in <listcomp>(.0=<list_iterator object>)
    126     def __init__(self, iterator_slice):
    127         self.items = list(iterator_slice)
    128         self._size = len(self.items)
    129 
    130     def __call__(self):
--> 131         return [func(*args, **kwargs) for func, args, kwargs in self.items]
        func = <function bin_estimate_single>
        args = (2877,      chrom      start        end      sum       ...2        735658     4  

[11504 rows x 9 columns], ['prob_2p', 'prob_f', 'prob_g', 'cor_2p', 'cor_f', 'cor_g', 'roc_2p', 'roc_f', 'roc_g', 'var_2p', 'var_f', 'ff_2p', 'ff_f', 'cv_2p', 'cv_f'])
        kwargs = {}
    132 
    133     def __len__(self):
    134         return self._size
    135 

...........................................................................
/usr/local/lib/python3.5/dist-packages/gpseqc/centrality.py in bin_estimate_single(i=2877, df=     chrom      start        end      sum       ...2        735658     4  

[11504 rows x 9 columns], mlist=['prob_2p', 'prob_f', 'prob_g', 'cor_2p', 'cor_f', 'cor_g', 'roc_2p', 'roc_f', 'roc_g', 'var_2p', 'var_f', 'ff_2p', 'ff_f', 'cv_2p', 'cv_f'])
    238 
    239     # Calculate requested metrics
    240     for m in mlist:
    241         # Probability
    242         if m == "prob_2p": # two-points
--> 243             orow[m] = est_2p(st, calc_p, lambda x, y: x / y)
        orow = 'chr6'
        m = 'prob_2p'
        st = 0        chr6
1    87000000
2    88000000
3     ...   596776
8           3
Name: 2877, dtype: object
    244         elif m == "prob_f": # fixed
    245             orow[m] = est_f(st, calc_p, lambda x, y: x / y)
    246         elif m == "prob_g": # global
    247             orow[m] = est_g(st, calc_p, lambda x, y: x / y)

...........................................................................
/usr/local/lib/python3.5/dist-packages/gpseqc/centrality.py in est_2p(st=0        chr6
1    87000000
2    88000000
3     ...   596776
8           3
Name: 2877, dtype: object, f1=<function calc_p>, f2=<function bin_estimate_single.<locals>.<lambda>>)
    118         f2 (fun): function for putting conditions together.
    119 
    120     Returns:
    121         Estimated centrality.
    122     '''
--> 123     a = f1(st, 0)
        a = undefined
        f1 = <function calc_p>
        st = 0        chr6
1    87000000
2    88000000
3     ...   596776
8           3
Name: 2877, dtype: object
    124     b = f1(st, st.shape[0] - 1)
    125     return(f2(b, a))
    126 
    127 def est_f(st, f1, f2):

...........................................................................
/usr/local/lib/python3.5/dist-packages/gpseqc/centrality.py in calc_p(st=0        chr6
1    87000000
2    88000000
3     ...   596776
8           3
Name: 2877, dtype: object, ci=0)
     26 
     27     Returns:
     28         float
     29     '''
     30     assert ci < st.shape[0], "requested condition (index) not found."
---> 31     row = st.iloc[ci, :]
        row = undefined
        st.iloc = <pandas.core.indexing._iLocIndexer object>
        ci = 0
     32     p = (row['cond_nreads'] * row['count'])
     33     p = row['sum'] / p if 0 != p else np.nan
     34     return(p)
     35 

...........................................................................
/usr/local/lib/python3.5/dist-packages/pandas/core/indexing.py in __getitem__(self=<pandas.core.indexing._iLocIndexer object>, key=(0, slice(None, None, None)))
   1362             try:
   1363                 if self._is_scalar_access(key):
   1364                     return self._getitem_scalar(key)
   1365             except (KeyError, IndexError):
   1366                 pass
-> 1367             return self._getitem_tuple(key)
        self._getitem_tuple = <bound method _iLocIndexer._getitem_tuple of <pandas.core.indexing._iLocIndexer object>>
        key = (0, slice(None, None, None))
   1368         else:
   1369             # we by definition only have the 0th axis
   1370             axis = self.axis or 0
   1371 

...........................................................................
/usr/local/lib/python3.5/dist-packages/pandas/core/indexing.py in _getitem_tuple(self=<pandas.core.indexing._iLocIndexer object>, tup=(0, slice(None, None, None)))
   1732 
   1733         return True
   1734 
   1735     def _getitem_tuple(self, tup):
   1736 
-> 1737         self._has_valid_tuple(tup)
        self._has_valid_tuple = <bound method _NDFrameIndexer._has_valid_tuple of <pandas.core.indexing._iLocIndexer object>>
        tup = (0, slice(None, None, None))
   1738         try:
   1739             return self._getitem_lowerdim(tup)
   1740         except:
   1741             pass

...........................................................................
/usr/local/lib/python3.5/dist-packages/pandas/core/indexing.py in _has_valid_tuple(self=<pandas.core.indexing._iLocIndexer object>, key=(0, slice(None, None, None)))
    198 
    199     def _has_valid_tuple(self, key):
    200         """ check the key for valid keys across my indexer """
    201         for i, k in enumerate(key):
    202             if i >= self.obj.ndim:
--> 203                 raise IndexingError('Too many indexers')
    204             if not self._has_valid_type(k, i):
    205                 raise ValueError("Location based indexing can only have "
    206                                  "[{types}] types"
    207                                  .format(types=self._valid_types))

IndexingError: Too many indexers
___________________________________________________________________________

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/bin/gpseqc_estimate", line 504, in <module>
    est = centrality.bin_estimate_parallel(comb, toCalc, args.threads)
  File "/usr/local/lib/python3.5/dist-packages/gpseqc/centrality.py", line 213, in bin_estimate_parallel
    for i in list(set(df.index)))
  File "/usr/local/lib/python3.5/dist-packages/joblib/parallel.py", line 789, in __call__
    self.retrieve()
  File "/usr/local/lib/python3.5/dist-packages/joblib/parallel.py", line 740, in retrieve
    raise exception
joblib.my_exceptions.JoblibIndexingError: JoblibIndexingError
___________________________________________________________________________
Multiprocessing exception:
...........................................................................
/usr/local/bin/gpseqc_estimate in <module>()
    499 
    500 # Estimate centrality of each bin
    501 if 1 == args.threads:
    502 	est = centrality.bin_estimate(comb, toCalc)
    503 else:
--> 504 	est = centrality.bin_estimate_parallel(comb, toCalc, args.threads)
    505 df_saveas(est, "estimated.%s.tsv" % descr, args)
    506 
    507 # (10) Rank bins ---------------------------------------------------------------
    508 print("Ranking bins...")

...........................................................................
/usr/local/lib/python3.5/dist-packages/gpseqc/centrality.py in bin_estimate_parallel(df=     chrom      start        end      sum       ...2        735658     4  

[11504 rows x 9 columns], mlist=['prob_2p', 'prob_f', 'prob_g', 'cor_2p', 'cor_f', 'cor_g', 'roc_2p', 'roc_f', 'roc_g', 'var_2p', 'var_f', 'ff_2p', 'ff_f', 'cv_2p', 'cv_f'], threads=7, progress=True)
    208     verbose = 10 if progress else 0
    209 
    210     # Iterate over bins
    211     odf =  Parallel(n_jobs = threads, verbose = verbose)(
    212         delayed(bin_estimate_single)(i, df, mlist)
--> 213         for i in list(set(df.index)))
        df.index = Int64Index([ 370, 2325, 2323, 2328, 2428, 2425, ...9,  142],
           dtype='int64', length=11504)
    214 
    215     # Assemble output
    216     odf = pd.concat(odf, axis = 1).transpose()
    217     columns = ['chrom', 'start', 'end']

...........................................................................
/usr/local/lib/python3.5/dist-packages/joblib/parallel.py in __call__(self=Parallel(n_jobs=7), iterable=<generator object bin_estimate_parallel.<locals>.<genexpr>>)
    784             if pre_dispatch == "all" or n_jobs == 1:
    785                 # The iterable was consumed all at once by the above for loop.
    786                 # No need to wait for async callbacks to trigger to
    787                 # consumption.
    788                 self._iterating = False
--> 789             self.retrieve()
        self.retrieve = <bound method Parallel.retrieve of Parallel(n_jobs=7)>
    790             # Make sure that we get a last message telling us we are done
    791             elapsed_time = time.time() - self._start_time
    792             self._print('Done %3i out of %3i | elapsed: %s finished',
    793                         (len(self._output), len(self._output),

---------------------------------------------------------------------------
Sub-process traceback:
---------------------------------------------------------------------------
IndexingError                                      Tue Apr 24 13:48:52 2018
PID: 29536                                   Python 3.5.2: /usr/bin/python3
...........................................................................
/usr/local/lib/python3.5/dist-packages/joblib/parallel.py in __call__(self=<joblib.parallel.BatchedCalls object>)
    126     def __init__(self, iterator_slice):
    127         self.items = list(iterator_slice)
    128         self._size = len(self.items)
    129 
    130     def __call__(self):
--> 131         return [func(*args, **kwargs) for func, args, kwargs in self.items]
        self.items = [(<function bin_estimate_single>, (2870,      chrom      start        end      sum       ...2        735658     4  

[11504 rows x 9 columns], ['prob_2p', 'prob_f', 'prob_g', 'cor_2p', 'cor_f', 'cor_g', 'roc_2p', 'roc_f', 'roc_g', 'var_2p', 'var_f', 'ff_2p', 'ff_f', 'cv_2p', 'cv_f']), {}), (<function bin_estimate_single>, (2871,      chrom      start        end      sum       ...2        735658     4  

[11504 rows x 9 columns], ['prob_2p', 'prob_f', 'prob_g', 'cor_2p', 'cor_f', 'cor_g', 'roc_2p', 'roc_f', 'roc_g', 'var_2p', 'var_f', 'ff_2p', 'ff_f', 'cv_2p', 'cv_f']), {}), (<function bin_estimate_single>, (2872,      chrom      start        end      sum       ...2        735658     4  

[11504 rows x 9 columns], ['prob_2p', 'prob_f', 'prob_g', 'cor_2p', 'cor_f', 'cor_g', 'roc_2p', 'roc_f', 'roc_g', 'var_2p', 'var_f', 'ff_2p', 'ff_f', 'cv_2p', 'cv_f']), {}), (<function bin_estimate_single>, (2873,      chrom      start        end      sum       ...2        735658     4  

[11504 rows x 9 columns], ['prob_2p', 'prob_f', 'prob_g', 'cor_2p', 'cor_f', 'cor_g', 'roc_2p', 'roc_f', 'roc_g', 'var_2p', 'var_f', 'ff_2p', 'ff_f', 'cv_2p', 'cv_f']), {}), (<function bin_estimate_single>, (2874,      chrom      start        end      sum       ...2        735658     4  

[11504 rows x 9 columns], ['prob_2p', 'prob_f', 'prob_g', 'cor_2p', 'cor_f', 'cor_g', 'roc_2p', 'roc_f', 'roc_g', 'var_2p', 'var_f', 'ff_2p', 'ff_f', 'cv_2p', 'cv_f']), {}), (<function bin_estimate_single>, (2875,      chrom      start        end      sum       ...2        735658     4  

[11504 rows x 9 columns], ['prob_2p', 'prob_f', 'prob_g', 'cor_2p', 'cor_f', 'cor_g', 'roc_2p', 'roc_f', 'roc_g', 'var_2p', 'var_f', 'ff_2p', 'ff_f', 'cv_2p', 'cv_f']), {}), (<function bin_estimate_single>, (2876,      chrom      start        end      sum       ...2        735658     4  

[11504 rows x 9 columns], ['prob_2p', 'prob_f', 'prob_g', 'cor_2p', 'cor_f', 'cor_g', 'roc_2p', 'roc_f', 'roc_g', 'var_2p', 'var_f', 'ff_2p', 'ff_f', 'cv_2p', 'cv_f']), {}), (<function bin_estimate_single>, (2877,      chrom      start        end      sum       ...2        735658     4  

[11504 rows x 9 columns], ['prob_2p', 'prob_f', 'prob_g', 'cor_2p', 'cor_f', 'cor_g', 'roc_2p', 'roc_f', 'roc_g', 'var_2p', 'var_f', 'ff_2p', 'ff_f', 'cv_2p', 'cv_f']), {})]
    132 
    133     def __len__(self):
    134         return self._size
    135 

...........................................................................
/usr/local/lib/python3.5/dist-packages/joblib/parallel.py in <listcomp>(.0=<list_iterator object>)
    126     def __init__(self, iterator_slice):
    127         self.items = list(iterator_slice)
    128         self._size = len(self.items)
    129 
    130     def __call__(self):
--> 131         return [func(*args, **kwargs) for func, args, kwargs in self.items]
        func = <function bin_estimate_single>
        args = (2877,      chrom      start        end      sum       ...2        735658     4  

[11504 rows x 9 columns], ['prob_2p', 'prob_f', 'prob_g', 'cor_2p', 'cor_f', 'cor_g', 'roc_2p', 'roc_f', 'roc_g', 'var_2p', 'var_f', 'ff_2p', 'ff_f', 'cv_2p', 'cv_f'])
        kwargs = {}
    132 
    133     def __len__(self):
    134         return self._size
    135 

...........................................................................
/usr/local/lib/python3.5/dist-packages/gpseqc/centrality.py in bin_estimate_single(i=2877, df=     chrom      start        end      sum       ...2        735658     4  

[11504 rows x 9 columns], mlist=['prob_2p', 'prob_f', 'prob_g', 'cor_2p', 'cor_f', 'cor_g', 'roc_2p', 'roc_f', 'roc_g', 'var_2p', 'var_f', 'ff_2p', 'ff_f', 'cv_2p', 'cv_f'])
    238 
    239     # Calculate requested metrics
    240     for m in mlist:
    241         # Probability
    242         if m == "prob_2p": # two-points
--> 243             orow[m] = est_2p(st, calc_p, lambda x, y: x / y)
        orow = 'chr6'
        m = 'prob_2p'
        st = 0        chr6
1    87000000
2    88000000
3     ...   596776
8           3
Name: 2877, dtype: object
    244         elif m == "prob_f": # fixed
    245             orow[m] = est_f(st, calc_p, lambda x, y: x / y)
    246         elif m == "prob_g": # global
    247             orow[m] = est_g(st, calc_p, lambda x, y: x / y)

...........................................................................
/usr/local/lib/python3.5/dist-packages/gpseqc/centrality.py in est_2p(st=0        chr6
1    87000000
2    88000000
3     ...   596776
8           3
Name: 2877, dtype: object, f1=<function calc_p>, f2=<function bin_estimate_single.<locals>.<lambda>>)
    118         f2 (fun): function for putting conditions together.
    119 
    120     Returns:
    121         Estimated centrality.
    122     '''
--> 123     a = f1(st, 0)
        a = undefined
        f1 = <function calc_p>
        st = 0        chr6
1    87000000
2    88000000
3     ...   596776
8           3
Name: 2877, dtype: object
    124     b = f1(st, st.shape[0] - 1)
    125     return(f2(b, a))
    126 
    127 def est_f(st, f1, f2):

...........................................................................
/usr/local/lib/python3.5/dist-packages/gpseqc/centrality.py in calc_p(st=0        chr6
1    87000000
2    88000000
3     ...   596776
8           3
Name: 2877, dtype: object, ci=0)
     26 
     27     Returns:
     28         float
     29     '''
     30     assert ci < st.shape[0], "requested condition (index) not found."
---> 31     row = st.iloc[ci, :]
        row = undefined
        st.iloc = <pandas.core.indexing._iLocIndexer object>
        ci = 0
     32     p = (row['cond_nreads'] * row['count'])
     33     p = row['sum'] / p if 0 != p else np.nan
     34     return(p)
     35 

...........................................................................
/usr/local/lib/python3.5/dist-packages/pandas/core/indexing.py in __getitem__(self=<pandas.core.indexing._iLocIndexer object>, key=(0, slice(None, None, None)))
   1362             try:
   1363                 if self._is_scalar_access(key):
   1364                     return self._getitem_scalar(key)
   1365             except (KeyError, IndexError):
   1366                 pass
-> 1367             return self._getitem_tuple(key)
        self._getitem_tuple = <bound method _iLocIndexer._getitem_tuple of <pandas.core.indexing._iLocIndexer object>>
        key = (0, slice(None, None, None))
   1368         else:
   1369             # we by definition only have the 0th axis
   1370             axis = self.axis or 0
   1371 

...........................................................................
/usr/local/lib/python3.5/dist-packages/pandas/core/indexing.py in _getitem_tuple(self=<pandas.core.indexing._iLocIndexer object>, tup=(0, slice(None, None, None)))
   1732 
   1733         return True
   1734 
   1735     def _getitem_tuple(self, tup):
   1736 
-> 1737         self._has_valid_tuple(tup)
        self._has_valid_tuple = <bound method _NDFrameIndexer._has_valid_tuple of <pandas.core.indexing._iLocIndexer object>>
        tup = (0, slice(None, None, None))
   1738         try:
   1739             return self._getitem_lowerdim(tup)
   1740         except:
   1741             pass

...........................................................................
/usr/local/lib/python3.5/dist-packages/pandas/core/indexing.py in _has_valid_tuple(self=<pandas.core.indexing._iLocIndexer object>, key=(0, slice(None, None, None)))
    198 
    199     def _has_valid_tuple(self, key):
    200         """ check the key for valid keys across my indexer """
    201         for i, k in enumerate(key):
    202             if i >= self.obj.ndim:
--> 203                 raise IndexingError('Too many indexers')
    204             if not self._has_valid_type(k, i):
    205                 raise ValueError("Location based indexing can only have "
    206                                  "[{types}] types"
    207                                  .format(types=self._valid_types))

IndexingError: Too many indexers
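A likely root cause, offered as a hedged sketch: the repr in the final frames (`Name: 2877, dtype: object`) shows that `st` is a one-dimensional pandas Series, not a DataFrame, and two-dimensional indexing like `st.iloc[ci, :]` on a Series raises exactly this `IndexingError`. This happens when `df.loc[label]` matches a single row (Series) instead of several (DataFrame). A minimal reproduction, with illustrative values:

```python
import pandas as pd

# Illustrative frame: index labels stand in for bin ids; one bin (2877)
# has a single condition row, the other (10) has two.
df = pd.DataFrame(
    {"chrom": ["chr6", "chr1", "chr1"],
     "sum": [596776, 100, 200],
     "count": [3, 4, 5]},
    index=[2877, 10, 10])

st = df.loc[2877]    # single match -> 1-D Series (ndim == 1)
# st.iloc[0, :]      # would raise IndexingError: Too many indexers

st = df.loc[[2877]]  # list of labels -> always a 2-D DataFrame
row = st.iloc[0, :]  # valid, regardless of how many rows matched
```

Selecting with a list of labels (`df.loc[[i]]`) keeps the result two-dimensional even for single-row bins, which would make `calc_p`'s `st.iloc[ci, :]` safe.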

Additional information

gpseqc tmp file not found

Before submitting an issue, please be sure to

This issue affects

  • Centrality estimation (gpseqc_estimate)
  • Ranks comparison (gpseqc_compare)

What did you do (e.g., steps to reproduce)

I ran multiple parallelized instances of gpseqc_estimate simultaneously.

What did you expect to happen?

To be able to run all of those instances.

What happened instead?

Some instances crashed due to missing tmp file(s).

Additional information

Temporary files need to be managed in a better way. Suggestion: create a temporary subfolder where to store everything? Also, use a hash of the number of seconds since the UNIX Epoch to avoid name collisions?
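The suggestion above can be sketched with the standard library; the `gpseqc_` prefix and per-run subfolder layout are illustrative assumptions, not the tool's current behavior. Note that `tempfile.mkdtemp` already guarantees a unique directory name, so the Epoch hash is mainly a human-readable run tag:

```python
import hashlib
import tempfile
import time
from pathlib import Path

# Tag the run with a hash of the current UNIX Epoch time, then create a
# unique per-run temporary subfolder so that concurrent gpseqc_estimate
# instances never share (or delete) each other's tmp files.
run_id = hashlib.md5(str(time.time()).encode()).hexdigest()[:12]
run_tmp = Path(tempfile.mkdtemp(prefix="gpseqc_%s_" % run_id))

# All intermediate files for this run live under run_tmp.
tmp_bed = run_tmp / "intersected.bed"
tmp_bed.write_text("chr1\t0\t100\n")
```

Cleaning up is then a single `shutil.rmtree(run_tmp)` at the end of the run, with no risk of removing another instance's files.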

KTw rank sorting assert error

Before submitting an issue, please be sure to

This issue affects

  • Centrality estimation (gpseqc_estimate)
  • Ranks comparison (gpseqc_compare)

What did you do (e.g., steps to reproduce)

/usr/local/bin/gpseqc_compare /home/bicro/Desktop/Backup/user_folders_HD2/Tomek/BICRO60/Data_output/bed_files/new/10Mb/estimated.bins.size10000000.step10000000.group10000.csm3.tsv /home/bicro/Desktop/Backup/user_folders_HD2/Tomek/BICRO62/Data_output/bed_files/new/10Mb/estimated.bins.size10000000.step10000000.group10000.csm3.tsv -o /home/bicro/Desktop/Backup/user_folders_HD2/Tomek/BICRO62/Data_output/bed_files/new/10Mb/rankcmp -t 36 -p B62_vs_B60_10Mb
@2018-05-17 14:31:50.498398

Python 3.6.5 (default, Apr  1 2018, 05:46:30) 
[GCC 7.3.0]

 # GPSeqC - Ranking comparison

 1st rank : /home/bicro/Desktop/Backup/user_folders_HD2/Tomek/BICRO60/Data_output/bed_files/new/10Mb/estimated.bins.size10000000.step10000000.group10000.csm3.tsv
 2nd rank : /home/bicro/Desktop/Backup/user_folders_HD2/Tomek/BICRO62/Data_output/bed_files/new/10Mb/estimated.bins.size10000000.step10000000.group10000.csm3.tsv

 Distance : ktw
    nIter : 5000
  Threads : 36

    Delim : '	'
   Prefix : 'B62_vs_B60_10Mb.'

What did you expect to happen?

What happened instead?

An assertion error stating that the first ranking is not sorted.

Additional information
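A hedged workaround, assuming the assertion refers to the genomic order of the estimate tables: sort both rank files by chromosome and start coordinate before running gpseqc_compare. The column names below match the estimated bin tables, but the toy values are illustrative:

```python
import pandas as pd

# Toy rank table; the real estimated.*.tsv files carry more metric columns.
df = pd.DataFrame({
    "chrom": ["chr2", "chr1", "chr1"],
    "start": [0, 10000000, 0],
    "end":   [10000000, 20000000, 10000000],
    "prob_2p": [0.3, 0.1, 0.2],
})

# Lexicographic chrom sort suffices here; with chr10 vs chr2 in the same
# table a natural sort key would be needed instead.
df = df.sort_values(["chrom", "start", "end"]).reset_index(drop=True)
```

Writing the sorted frame back with `df.to_csv(path, sep="\t", index=False)` yields an input that should pass the sortedness assertion.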

Empty bed file triggers issue.

Before submitting an issue, please be sure to

This issue affects

  • Centrality estimation (gpseqc_estimate)
  • Ranks comparison (gpseqc_compare)

What did you do (e.g., steps to reproduce)

gpseqc_estimate '/home/bicro/Desktop/user_folders_HD2/Quim/1min_exclusion/1Mb/BICRO55/input/TK94_5min_GG__cutsiteLoc-umiCount.bed' '/home/bicro/Desktop/user_folders_HD2/Quim/1min_exclusion/1Mb/BICRO55/input/TK95_10min_GG__cutsiteLoc-umiCount.bed' '/home/bicro/Desktop/user_folders_HD2/Quim/1min_exclusion/1Mb/BICRO55/input/TK96_15min_GG__cutsiteLoc-umiCount.bed' '/home/bicro/Desktop/user_folders_HD2/Quim/1min_exclusion/1Mb/BICRO55/input/TK97_30min_GG__cutsiteLoc-umiCount.bed' '/home/bicro/Desktop/user_folders_HD2/Quim/1min_exclusion/1Mb/BICRO55/input/TK98_on_GG__cutsiteLoc-umiCount.bed' -o '/home/bicro/Desktop/user_folders_HD2/Quim/1min_exclusion/1Mb/BICRO55/output' -s 1000000 -r BICRO55_excluding_1min_1MB_allMetrics -t 7

What did you expect to happen?

The command should run to completion.

What happened instead?

The run crashed with an AssertionError.

Additional information

    Threads : 7
 Output dir : /home/bicro/Desktop/user_folders_HD2/Quim/1min_exclusion/1Mb/BICRO55/output
  Bed files : 
   (1) /home/bicro/Desktop/user_folders_HD2/Quim/1min_exclusion/1Mb/BICRO55/input/TK94_5min_GG__cutsiteLoc-umiCount.bed
   (2) /home/bicro/Desktop/user_folders_HD2/Quim/1min_exclusion/1Mb/BICRO55/input/TK95_10min_GG__cutsiteLoc-umiCount.bed
   (3) /home/bicro/Desktop/user_folders_HD2/Quim/1min_exclusion/1Mb/BICRO55/input/TK96_15min_GG__cutsiteLoc-umiCount.bed
   (4) /home/bicro/Desktop/user_folders_HD2/Quim/1min_exclusion/1Mb/BICRO55/input/TK97_30min_GG__cutsiteLoc-umiCount.bed
   (5) /home/bicro/Desktop/user_folders_HD2/Quim/1min_exclusion/1Mb/BICRO55/input/TK98_on_GG__cutsiteLoc-umiCount.bed

Confirm settings and proceed? (y/n)
y

Parsing bedfiles and counting reads...
Identifying chromosomes...
100%|██████████| 5/5 [00:02<00:00,  2.14it/s]
Generating bins...
Preparing cutsites...
Removing empty sites...
Assigning to bins...
[Parallel(n_jobs=7)]: Done   1 tasks      | elapsed:   18.3s
[Parallel(n_jobs=7)]: Done   2 out of   5 | elapsed:   24.1s remaining:   36.2s
[Parallel(n_jobs=7)]: Done   3 out of   5 | elapsed:   27.1s remaining:   18.1s
multiprocessing.pool.RemoteTraceback: 
"""
Traceback (most recent call last):
  File "/usr/local/lib/python3.5/dist-packages/joblib/_parallel_backends.py", line 350, in __call__
    return self.func(*args, **kwargs)
  File "/usr/local/lib/python3.5/dist-packages/joblib/parallel.py", line 131, in __call__
    return [func(*args, **kwargs) for func, args, kwargs in self.items]
  File "/usr/local/lib/python3.5/dist-packages/joblib/parallel.py", line 131, in <listcomp>
    return [func(*args, **kwargs) for func, args, kwargs in self.items]
  File "/usr/local/bin/gpseqc_estimate", line 453, in do_assign
    bedfiles[i] = bed.to_bins(bins, bedfiles[i])
  File "/usr/local/lib/python3.5/dist-packages/gpseqc/bed.py", line 164, in to_bins
    assert bed.field_count() >= 5, assert_msg
AssertionError: missing score column, run with 'noValues = True'.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.5/multiprocessing/pool.py", line 119, in worker
    result = (True, func(*args, **kwds))
  File "/usr/local/lib/python3.5/dist-packages/joblib/_parallel_backends.py", line 359, in __call__
    raise TransportableException(text, e_type)
joblib.my_exceptions.TransportableException: TransportableException
___________________________________________________________________________
AssertionError                                     Tue Apr 24 13:18:21 2018
PID: 28363                                   Python 3.5.2: /usr/bin/python3
...........................................................................
/usr/local/lib/python3.5/dist-packages/joblib/parallel.py in __call__(self=<joblib.parallel.BatchedCalls object>)
    126     def __init__(self, iterator_slice):
    127         self.items = list(iterator_slice)
    128         self._size = len(self.items)
    129 
    130     def __call__(self):
--> 131         return [func(*args, **kwargs) for func, args, kwargs in self.items]
        self.items = [(<function do_assign>, (2, [<BedTool(/tmp/pybedtools.fgszti2x.tmp)>, <BedTool(/tmp/pybedtools.jq2ugpxj.tmp)>, <BedTool(/tmp/pybedtools.4o5myg6a.tmp)>, <BedTool(/tmp/pybedtools.30h383xh.tmp)>, <BedTool(/tmp/pybedtools.m4ufhh7z.tmp)>], <BedTool(/tmp/pybedtools.4k4r_8p9.tmp)>, 'bins.size1000000.step1000000.csm3', Namespace(T='/tmp', bedfile=['/home/bicro/Deskto...ding_1min_1MB_allMetrics.', suffix='', threads=7)), {})]
    132 
    133     def __len__(self):
    134         return self._size
    135 

...........................................................................
/usr/local/lib/python3.5/dist-packages/joblib/parallel.py in <listcomp>(.0=<list_iterator object>)
    126     def __init__(self, iterator_slice):
    127         self.items = list(iterator_slice)
    128         self._size = len(self.items)
    129 
    130     def __call__(self):
--> 131         return [func(*args, **kwargs) for func, args, kwargs in self.items]
        func = <function do_assign>
        args = (2, [<BedTool(/tmp/pybedtools.fgszti2x.tmp)>, <BedTool(/tmp/pybedtools.jq2ugpxj.tmp)>, <BedTool(/tmp/pybedtools.4o5myg6a.tmp)>, <BedTool(/tmp/pybedtools.30h383xh.tmp)>, <BedTool(/tmp/pybedtools.m4ufhh7z.tmp)>], <BedTool(/tmp/pybedtools.4k4r_8p9.tmp)>, 'bins.size1000000.step1000000.csm3', Namespace(T='/tmp', bedfile=['/home/bicro/Deskto...ding_1min_1MB_allMetrics.', suffix='', threads=7))
        kwargs = {}
    132 
    133     def __len__(self):
    134         return self._size
    135 

...........................................................................
/usr/local/bin/gpseqc_estimate in do_assign(i=2, bedfiles=[<BedTool(/tmp/pybedtools.fgszti2x.tmp)>, <BedTool(/tmp/pybedtools.jq2ugpxj.tmp)>, <BedTool(/tmp/pybedtools.4o5myg6a.tmp)>, <BedTool(/tmp/pybedtools.30h383xh.tmp)>, <BedTool(/tmp/pybedtools.m4ufhh7z.tmp)>], bins=<BedTool(/tmp/pybedtools.4k4r_8p9.tmp)>, descr='bins.size1000000.step1000000.csm3', args=Namespace(T='/tmp', bedfile=['/home/bicro/Deskto...ding_1min_1MB_allMetrics.', suffix='', threads=7))
    448 
    449 # (6) Assign reads to bins (intersect) -----------------------------------------
    450 print("Assigning to bins...")
    451 
    452 def do_assign(i, bedfiles, bins, descr, args):
--> 453 	bedfiles[i] = bed.to_bins(bins, bedfiles[i])
        bedfiles = [<BedTool(/tmp/pybedtools.fgszti2x.tmp)>, <BedTool(/tmp/pybedtools.jq2ugpxj.tmp)>, <BedTool(/tmp/pybedtools.4o5myg6a.tmp)>, <BedTool(/tmp/pybedtools.30h383xh.tmp)>, <BedTool(/tmp/pybedtools.m4ufhh7z.tmp)>]
        i = 2
        bins = <BedTool(/tmp/pybedtools.4k4r_8p9.tmp)>
    454 
    455 	# Save if debugging
    456 	if args.debug_mode: bed_saveas(bedfiles[i], "intersected.%s.%s.tsv" % (
    457 		descr, os.path.basename(args.bedfile[i])), args)

...........................................................................
/usr/local/lib/python3.5/dist-packages/gpseqc/bed.py in to_bins(bins=<BedTool(/tmp/pybedtools.4k4r_8p9.tmp)>, bed=<BedTool(/tmp/pybedtools.4o5myg6a.tmp)>, noValues=False, skipEmpty=True)
    159         pbt.BedTool: grouped bed.
    160     '''
    161 
    162     if not noValues:
    163         assert_msg = "missing score column, run with 'noValues = True'."
--> 164         assert bed.field_count() >= 5, assert_msg
        bed.field_count = <bound method BedTool.field_count of <BedTool(/tmp/pybedtools.4o5myg6a.tmp)>>
        assert_msg = "missing score column, run with 'noValues = True'."
    165         bed = bed.cut(range(5)).sort() # Force to BED5
    166 
    167     # Enforce bins to BED3
    168     bins = bins.cut(range(3)).sort()

AssertionError: missing score column, run with 'noValues = True'.
___________________________________________________________________________
"""

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.5/dist-packages/joblib/parallel.py", line 699, in retrieve
    self._output.extend(job.get(timeout=self.timeout))
  File "/usr/lib/python3.5/multiprocessing/pool.py", line 608, in get
    raise self._value
joblib.my_exceptions.TransportableException: TransportableException
___________________________________________________________________________
AssertionError                                     Tue Apr 24 13:18:21 2018
PID: 28363                                   Python 3.5.2: /usr/bin/python3
...........................................................................
/usr/local/lib/python3.5/dist-packages/joblib/parallel.py in __call__(self=<joblib.parallel.BatchedCalls object>)
    126     def __init__(self, iterator_slice):
    127         self.items = list(iterator_slice)
    128         self._size = len(self.items)
    129 
    130     def __call__(self):
--> 131         return [func(*args, **kwargs) for func, args, kwargs in self.items]
        self.items = [(<function do_assign>, (2, [<BedTool(/tmp/pybedtools.fgszti2x.tmp)>, <BedTool(/tmp/pybedtools.jq2ugpxj.tmp)>, <BedTool(/tmp/pybedtools.4o5myg6a.tmp)>, <BedTool(/tmp/pybedtools.30h383xh.tmp)>, <BedTool(/tmp/pybedtools.m4ufhh7z.tmp)>], <BedTool(/tmp/pybedtools.4k4r_8p9.tmp)>, 'bins.size1000000.step1000000.csm3', Namespace(T='/tmp', bedfile=['/home/bicro/Deskto...ding_1min_1MB_allMetrics.', suffix='', threads=7)), {})]
    132 
    133     def __len__(self):
    134         return self._size
    135 

...........................................................................
/usr/local/lib/python3.5/dist-packages/joblib/parallel.py in <listcomp>(.0=<list_iterator object>)
    126     def __init__(self, iterator_slice):
    127         self.items = list(iterator_slice)
    128         self._size = len(self.items)
    129 
    130     def __call__(self):
--> 131         return [func(*args, **kwargs) for func, args, kwargs in self.items]
        func = <function do_assign>
        args = (2, [<BedTool(/tmp/pybedtools.fgszti2x.tmp)>, <BedTool(/tmp/pybedtools.jq2ugpxj.tmp)>, <BedTool(/tmp/pybedtools.4o5myg6a.tmp)>, <BedTool(/tmp/pybedtools.30h383xh.tmp)>, <BedTool(/tmp/pybedtools.m4ufhh7z.tmp)>], <BedTool(/tmp/pybedtools.4k4r_8p9.tmp)>, 'bins.size1000000.step1000000.csm3', Namespace(T='/tmp', bedfile=['/home/bicro/Deskto...ding_1min_1MB_allMetrics.', suffix='', threads=7))
        kwargs = {}
    132 
    133     def __len__(self):
    134         return self._size
    135 

...........................................................................
/usr/local/bin/gpseqc_estimate in do_assign(i=2, bedfiles=[<BedTool(/tmp/pybedtools.fgszti2x.tmp)>, <BedTool(/tmp/pybedtools.jq2ugpxj.tmp)>, <BedTool(/tmp/pybedtools.4o5myg6a.tmp)>, <BedTool(/tmp/pybedtools.30h383xh.tmp)>, <BedTool(/tmp/pybedtools.m4ufhh7z.tmp)>], bins=<BedTool(/tmp/pybedtools.4k4r_8p9.tmp)>, descr='bins.size1000000.step1000000.csm3', args=Namespace(T='/tmp', bedfile=['/home/bicro/Deskto...ding_1min_1MB_allMetrics.', suffix='', threads=7))
    448 
    449 # (6) Assign reads to bins (intersect) -----------------------------------------
    450 print("Assigning to bins...")
    451 
    452 def do_assign(i, bedfiles, bins, descr, args):
--> 453 	bedfiles[i] = bed.to_bins(bins, bedfiles[i])
        bedfiles = [<BedTool(/tmp/pybedtools.fgszti2x.tmp)>, <BedTool(/tmp/pybedtools.jq2ugpxj.tmp)>, <BedTool(/tmp/pybedtools.4o5myg6a.tmp)>, <BedTool(/tmp/pybedtools.30h383xh.tmp)>, <BedTool(/tmp/pybedtools.m4ufhh7z.tmp)>]
        i = 2
        bins = <BedTool(/tmp/pybedtools.4k4r_8p9.tmp)>
    454 
    455 	# Save if debugging
    456 	if args.debug_mode: bed_saveas(bedfiles[i], "intersected.%s.%s.tsv" % (
    457 		descr, os.path.basename(args.bedfile[i])), args)

...........................................................................
/usr/local/lib/python3.5/dist-packages/gpseqc/bed.py in to_bins(bins=<BedTool(/tmp/pybedtools.4k4r_8p9.tmp)>, bed=<BedTool(/tmp/pybedtools.4o5myg6a.tmp)>, noValues=False, skipEmpty=True)
    159         pbt.BedTool: grouped bed.
    160     '''
    161 
    162     if not noValues:
    163         assert_msg = "missing score column, run with 'noValues = True'."
--> 164         assert bed.field_count() >= 5, assert_msg
        bed.field_count = <bound method BedTool.field_count of <BedTool(/tmp/pybedtools.4o5myg6a.tmp)>>
        assert_msg = "missing score column, run with 'noValues = True'."
    165         bed = bed.cut(range(5)).sort() # Force to BED5
    166 
    167     # Enforce bins to BED3
    168     bins = bins.cut(range(3)).sort()

AssertionError: missing score column, run with 'noValues = True'.
___________________________________________________________________________

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/bin/gpseqc_estimate", line 466, in <module>
    for i in range(len(bedfiles)))
  File "/usr/local/lib/python3.5/dist-packages/joblib/parallel.py", line 789, in __call__
    self.retrieve()
  File "/usr/local/lib/python3.5/dist-packages/joblib/parallel.py", line 740, in retrieve
    raise exception
joblib.my_exceptions.JoblibAssertionError: JoblibAssertionError
___________________________________________________________________________
Multiprocessing exception:
...........................................................................
/usr/local/bin/gpseqc_estimate in <module>()
    461 	for i in tqdm(range(len(bedfiles))):
    462 		do_assign(i, bedfiles, bins, descr, args)
    463 else:
    464 	bedfiles = Parallel(n_jobs = args.threads, verbose = 11)(
    465 		delayed(do_assign)(i, bedfiles, bins, descr, args)
--> 466 		for i in range(len(bedfiles)))
    467 
    468 # (7) Calculate bin statistics -------------------------------------------------
    469 print("Calculating bin statistics...")
    470 

...........................................................................
/usr/local/lib/python3.5/dist-packages/joblib/parallel.py in __call__(self=Parallel(n_jobs=7), iterable=<generator object <genexpr>>)
    784             if pre_dispatch == "all" or n_jobs == 1:
    785                 # The iterable was consumed all at once by the above for loop.
    786                 # No need to wait for async callbacks to trigger to
    787                 # consumption.
    788                 self._iterating = False
--> 789             self.retrieve()
        self.retrieve = <bound method Parallel.retrieve of Parallel(n_jobs=7)>
    790             # Make sure that we get a last message telling us we are done
    791             elapsed_time = time.time() - self._start_time
    792             self._print('Done %3i out of %3i | elapsed: %s finished',
    793                         (len(self._output), len(self._output),

---------------------------------------------------------------------------
Sub-process traceback:
---------------------------------------------------------------------------
AssertionError                                     Tue Apr 24 13:18:21 2018
PID: 28363                                   Python 3.5.2: /usr/bin/python3
...........................................................................
/usr/local/lib/python3.5/dist-packages/joblib/parallel.py in __call__(self=<joblib.parallel.BatchedCalls object>)
    126     def __init__(self, iterator_slice):
    127         self.items = list(iterator_slice)
    128         self._size = len(self.items)
    129 
    130     def __call__(self):
--> 131         return [func(*args, **kwargs) for func, args, kwargs in self.items]
        self.items = [(<function do_assign>, (2, [<BedTool(/tmp/pybedtools.fgszti2x.tmp)>, <BedTool(/tmp/pybedtools.jq2ugpxj.tmp)>, <BedTool(/tmp/pybedtools.4o5myg6a.tmp)>, <BedTool(/tmp/pybedtools.30h383xh.tmp)>, <BedTool(/tmp/pybedtools.m4ufhh7z.tmp)>], <BedTool(/tmp/pybedtools.4k4r_8p9.tmp)>, 'bins.size1000000.step1000000.csm3', Namespace(T='/tmp', bedfile=['/home/bicro/Deskto...ding_1min_1MB_allMetrics.', suffix='', threads=7)), {})]
    132 
    133     def __len__(self):
    134         return self._size
    135 

...........................................................................
/usr/local/lib/python3.5/dist-packages/joblib/parallel.py in <listcomp>(.0=<list_iterator object>)
    126     def __init__(self, iterator_slice):
    127         self.items = list(iterator_slice)
    128         self._size = len(self.items)
    129 
    130     def __call__(self):
--> 131         return [func(*args, **kwargs) for func, args, kwargs in self.items]
        func = <function do_assign>
        args = (2, [<BedTool(/tmp/pybedtools.fgszti2x.tmp)>, <BedTool(/tmp/pybedtools.jq2ugpxj.tmp)>, <BedTool(/tmp/pybedtools.4o5myg6a.tmp)>, <BedTool(/tmp/pybedtools.30h383xh.tmp)>, <BedTool(/tmp/pybedtools.m4ufhh7z.tmp)>], <BedTool(/tmp/pybedtools.4k4r_8p9.tmp)>, 'bins.size1000000.step1000000.csm3', Namespace(T='/tmp', bedfile=['/home/bicro/Deskto...ding_1min_1MB_allMetrics.', suffix='', threads=7))
        kwargs = {}
    132 
    133     def __len__(self):
    134         return self._size
    135 

...........................................................................
/usr/local/bin/gpseqc_estimate in do_assign(i=2, bedfiles=[<BedTool(/tmp/pybedtools.fgszti2x.tmp)>, <BedTool(/tmp/pybedtools.jq2ugpxj.tmp)>, <BedTool(/tmp/pybedtools.4o5myg6a.tmp)>, <BedTool(/tmp/pybedtools.30h383xh.tmp)>, <BedTool(/tmp/pybedtools.m4ufhh7z.tmp)>], bins=<BedTool(/tmp/pybedtools.4k4r_8p9.tmp)>, descr='bins.size1000000.step1000000.csm3', args=Namespace(T='/tmp', bedfile=['/home/bicro/Deskto...ding_1min_1MB_allMetrics.', suffix='', threads=7))
    448 
    449 # (6) Assign reads to bins (intersect) -----------------------------------------
    450 print("Assigning to bins...")
    451 
    452 def do_assign(i, bedfiles, bins, descr, args):
--> 453 	bedfiles[i] = bed.to_bins(bins, bedfiles[i])
        bedfiles = [<BedTool(/tmp/pybedtools.fgszti2x.tmp)>, <BedTool(/tmp/pybedtools.jq2ugpxj.tmp)>, <BedTool(/tmp/pybedtools.4o5myg6a.tmp)>, <BedTool(/tmp/pybedtools.30h383xh.tmp)>, <BedTool(/tmp/pybedtools.m4ufhh7z.tmp)>]
        i = 2
        bins = <BedTool(/tmp/pybedtools.4k4r_8p9.tmp)>
    454 
    455 	# Save if debugging
    456 	if args.debug_mode: bed_saveas(bedfiles[i], "intersected.%s.%s.tsv" % (
    457 		descr, os.path.basename(args.bedfile[i])), args)

...........................................................................
/usr/local/lib/python3.5/dist-packages/gpseqc/bed.py in to_bins(bins=<BedTool(/tmp/pybedtools.4k4r_8p9.tmp)>, bed=<BedTool(/tmp/pybedtools.4o5myg6a.tmp)>, noValues=False, skipEmpty=True)
    159         pbt.BedTool: grouped bed.
    160     '''
    161 
    162     if not noValues:
    163         assert_msg = "missing score column, run with 'noValues = True'."
--> 164         assert bed.field_count() >= 5, assert_msg
        bed.field_count = <bound method BedTool.field_count of <BedTool(/tmp/pybedtools.4o5myg6a.tmp)>>
        assert_msg = "missing score column, run with 'noValues = True'."
    165         bed = bed.cut(range(5)).sort() # Force to BED5
    166 
    167     # Enforce bins to BED3
    168     bins = bins.cut(range(3)).sort()

AssertionError: missing score column, run with 'noValues = True'.
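The assertion above means the third input BED file reached `bed.to_bins` without a score column: the function expects BED5 input (chrom, start, end, name, score) unless called with `noValues = True`. A minimal, pybedtools-free sketch of the same precondition (the helper name is hypothetical, not part of gpseqc):

```python
def has_score_column(bed_line, sep="\t"):
    """Return True if a BED line carries at least 5 fields,
    i.e. chrom, start, end, name and score (BED5)."""
    return len(bed_line.rstrip("\n").split(sep)) >= 5

bed5 = "chr1\t0\t1000000\tsite_1\t42"   # cutsite with a UMI count as score
bed3 = "chr1\t0\t1000000"               # interval only: would trip the assert

assert has_score_column(bed5)
assert not has_score_column(bed3)
```

Checking the offending `__cutsiteLoc-umiCount.bed` file for a fifth column (or re-running the upstream step that writes the UMI counts) should resolve this assertion.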

Problem with running the microscopy pipeline

Before submitting an issue, please be sure to

This issue affects

- [ ] Centrality estimation (gpseqc_estimate)

What did you do (e.g., steps to reproduce)

I tried to run the GPSeq microscopy pipeline.

What did you expect to happen?

To get beautiful YFISH plots as an output :)

What happened instead?

The pipeline failed at the segmentation step.

* Looking for nuclei *
 
 Current condition: "iTK226_070618_001"...
Traceback (most recent call last):
  File "/usr/local/bin/gpseq_anim", line 6, in <module>
    exec(compile(open(__file__).read(), __file__, 'exec'))
  File "/home/bicro/Desktop/ggcode/pygpseq/bin/gpseq_anim", line 388, in <module>
    gpi = gpi.run()
  File "/home/bicro/Desktop/ggcode/pygpseq/pygpseq/anim/main.py", line 716, in run
    self.run_segmentation(**kwargs)
  File "/home/bicro/Desktop/ggcode/pygpseq/pygpseq/anim/main.py", line 914, in run_segmentation
    [c.find_nuclei(**kwargs) for c in self.conds]
  File "/home/bicro/Desktop/ggcode/pygpseq/pygpseq/anim/condition.py", line 505, in find_nuclei
    for i in range(len(self.series)))
  File "/usr/local/lib/python2.7/dist-packages/joblib/parallel.py", line 789, in __call__
    self.retrieve()
  File "/usr/local/lib/python2.7/dist-packages/joblib/parallel.py", line 740, in retrieve
    raise exception
joblib.my_exceptions.JoblibTypeError: JoblibTypeError
___________________________________________________________________________
Multiprocessing exception:
...........................................................................
/usr/local/bin/gpseq_anim in <module>()
      1 #!/usr/bin/env python3
      2 # -*- coding: utf-8 -*-
      3 
      4 # ------------------------------------------------------------------------------
      5 # 
----> 6 # MIT License
      7 # 
      8 # Copyright (c) 2017 Gabriele Girelli
      9 # 
     10 # Permission is hereby granted, free of charge, to any person obtaining a copy

...........................................................................
/home/bicro/Desktop/ggcode/pygpseq/bin/gpseq_anim in <module>()
    383 	OH.write("@%s\n\n" % datetime.datetime.now())
    384 	OH.write("%s\n" % sys.version)
    385 	OH.write(settings_string)
    386 
    387 # Start the analysis
--> 388 gpi = gpi.run()
    389 
    390 # End --------------------------------------------------------------------------
    391 
    392 ################################################################################

...........................................................................
/home/bicro/Desktop/ggcode/pygpseq/pygpseq/anim/main.py in run(self=<pygpseq.anim.main.Main object>, **kwargs={'adaptive_neighbourhood': 101, 'an_type': 3, 'aspect': (300.0, 216.6, 216.6), 'basedir': '/home/bicro/Desktop/user_folders/Tomek/microscop...ne/IMR90_iTK226-241_080618/Data_input_iTK226_233/', 'calc_n_surface': False, 'cdescr': {}, 'compressed': False, 'correctCA': False, 'debugging': False, 'dfield': 'lamin_dnorm', ...})
    711                 self.printout('Unskipping segmentation...', 0)
    712                 self.unskip(2)
    713         
    714         if not self.is_skipped(2):
    715             # Run segmentation if not skipped
--> 716             self.run_segmentation(**kwargs)
        self.run_segmentation = <bound method Main.run_segmentation of <pygpseq.anim.main.Main object>>
        kwargs = {'adaptive_neighbourhood': 101, 'an_type': 3, 'aspect': (300.0, 216.6, 216.6), 'basedir': '/home/bicro/Desktop/user_folders/Tomek/microscop...ne/IMR90_iTK226-241_080618/Data_input_iTK226_233/', 'calc_n_surface': False, 'cdescr': {}, 'compressed': False, 'correctCA': False, 'debugging': False, 'dfield': 'lamin_dnorm', ...}
    717 
    718             # Dump
    719             fname = self.outdir + 'gpi.seg'+ kwargs['suffix'] +'.cpickle'
    720             f = open(fname, 'wb')

...........................................................................
/home/bicro/Desktop/ggcode/pygpseq/pygpseq/anim/main.py in run_segmentation(self=<pygpseq.anim.main.Main object>, **kwargs={'adaptive_neighbourhood': 101, 'an_type': 3, 'aspect': (300.0, 216.6, 216.6), 'basedir': '/home/bicro/Desktop/user_folders/Tomek/microscop...ne/IMR90_iTK226-241_080618/Data_input_iTK226_233/', 'calc_n_surface': False, 'cdescr': {}, 'compressed': False, 'correctCA': False, 'debugging': False, 'dfield': 'lamin_dnorm', ...})
    909         kwargs['an_type'] = self.an_type
    910 
    911         # Identify nuclei
    912         self.printout('* Looking for nuclei *', 0)
    913         self.printout('', 0)
--> 914         [c.find_nuclei(**kwargs) for c in self.conds]
        c.find_nuclei = <bound method Condition.find_nuclei of <pygpseq.anim.condition.Condition object>>
        kwargs = {'adaptive_neighbourhood': 101, 'an_type': 3, 'aspect': (300.0, 216.6, 216.6), 'basedir': '/home/bicro/Desktop/user_folders/Tomek/microscop...ne/IMR90_iTK226-241_080618/Data_input_iTK226_233/', 'calc_n_surface': False, 'cdescr': {}, 'compressed': False, 'correctCA': False, 'debugging': False, 'dfield': 'lamin_dnorm', ...}
        c = <pygpseq.anim.condition.Condition object>
        self.conds = [<pygpseq.anim.condition.Condition object>, <pygpseq.anim.condition.Condition object>, <pygpseq.anim.condition.Condition object>, <pygpseq.anim.condition.Condition object>, <pygpseq.anim.condition.Condition object>, <pygpseq.anim.condition.Condition object>, <pygpseq.anim.condition.Condition object>, <pygpseq.anim.condition.Condition object>]
    915         self.printout('', 0)
    916 
    917     def unskip(self, step):
    918         """Unskips a run step that was supposed to be skipped. """

...........................................................................
/home/bicro/Desktop/ggcode/pygpseq/pygpseq/anim/condition.py in find_nuclei(self=<pygpseq.anim.condition.Condition object>, **kwargs={'adaptive_neighbourhood': 101, 'an_type': 3, 'aspect': (300.0, 216.6, 216.6), 'basedir': '/home/bicro/Desktop/user_folders/Tomek/microscop...ne/IMR90_iTK226-241_080618/Data_input_iTK226_233/', 'calc_n_surface': False, 'cdescr': {}, 'compressed': False, 'cond_name': 'iTK226_070618_001', 'correctCA': False, 'debugging': False, ...})
    500 
    501         # Segment every series in the condition
    502         self.printout('Current condition: "' + self.name + '"...', 0)
    503         self.series = Parallel(n_jobs = ncores)(
    504             delayed(find_series_nuclei)(self, i, **kwargs)
--> 505             for i in range(len(self.series)))
        self.series = [<pygpseq.anim.series.Series object>, <pygpseq.anim.series.Series object>, <pygpseq.anim.series.Series object>]
    506 
    507     def get_nuclei(self):
    508         """Return a list of the nuclei in the condition. """
    509         nuclei = []

...........................................................................
/usr/local/lib/python2.7/dist-packages/joblib/parallel.py in __call__(self=Parallel(n_jobs=36), iterable=<generator object <genexpr>>)
    784             if pre_dispatch == "all" or n_jobs == 1:
    785                 # The iterable was consumed all at once by the above for loop.
    786                 # No need to wait for async callbacks to trigger to
    787                 # consumption.
    788                 self._iterating = False
--> 789             self.retrieve()
        self.retrieve = <bound method Parallel.retrieve of Parallel(n_jobs=36)>
    790             # Make sure that we get a last message telling us we are done
    791             elapsed_time = time.time() - self._start_time
    792             self._print('Done %3i out of %3i | elapsed: %s finished',
    793                         (len(self._output), len(self._output),

---------------------------------------------------------------------------
Sub-process traceback:
---------------------------------------------------------------------------
TypeError                                          Mon Jun 18 20:18:22 2018
PID: 223234                               Python 2.7.15rc1: /usr/bin/python
...........................................................................
/usr/local/lib/python2.7/dist-packages/joblib/parallel.py in __call__(self=<joblib.parallel.BatchedCalls object>)
    126     def __init__(self, iterator_slice):
    127         self.items = list(iterator_slice)
    128         self._size = len(self.items)
    129 
    130     def __call__(self):
--> 131         return [func(*args, **kwargs) for func, args, kwargs in self.items]
        func = <function find_series_nuclei>
        args = (<pygpseq.anim.condition.Condition object>, 0)
        kwargs = {'adaptive_neighbourhood': 101, 'an_type': 3, 'aspect': (300.0, 216.6, 216.6), 'basedir': '/home/bicro/Desktop/user_folders/Tomek/microscop...ne/IMR90_iTK226-241_080618/Data_input_iTK226_233/', 'calc_n_surface': False, 'cdescr': {}, 'compressed': False, 'cond_name': 'iTK226_070618_001', 'correctCA': False, 'debugging': False, ...}
        self.items = [(<function find_series_nuclei>, (<pygpseq.anim.condition.Condition object>, 0), {'adaptive_neighbourhood': 101, 'an_type': 3, 'aspect': (300.0, 216.6, 216.6), 'basedir': '/home/bicro/Desktop/user_folders/Tomek/microscop...ne/IMR90_iTK226-241_080618/Data_input_iTK226_233/', 'calc_n_surface': False, 'cdescr': {}, 'compressed': False, 'cond_name': 'iTK226_070618_001', 'correctCA': False, 'debugging': False, ...})]
    132 
    133     def __len__(self):
    134         return self._size
    135 

...........................................................................
/home/bicro/Desktop/ggcode/pygpseq/pygpseq/anim/condition.py in find_series_nuclei(self=<pygpseq.anim.condition.Condition object>, i=0, **kwargs={'adaptive_neighbourhood': 101, 'an_type': 3, 'aspect': (300.0, 216.6, 216.6), 'basedir': '/home/bicro/Desktop/user_folders/Tomek/microscop...ne/IMR90_iTK226-241_080618/Data_input_iTK226_233/', 'calc_n_surface': False, 'cdescr': {}, 'compressed': False, 'cond_name': 'iTK226_070618_001', 'correctCA': False, 'debugging': False, ...})
    743 
    744     # Get starting time
    745     start_time = time.time()
    746 
    747     # Find nuclei
--> 748     self.series[i], log = self.series[i].find_nuclei(**kwargs)
        self.series = [<pygpseq.anim.series.Series object>, <pygpseq.anim.series.Series object>, <pygpseq.anim.series.Series object>]
        i = 0
        log = undefined
        i.find_nuclei = undefined
        kwargs = {'adaptive_neighbourhood': 101, 'an_type': 3, 'aspect': (300.0, 216.6, 216.6), 'basedir': '/home/bicro/Desktop/user_folders/Tomek/microscop...ne/IMR90_iTK226-241_080618/Data_input_iTK226_233/', 'calc_n_surface': False, 'cdescr': {}, 'compressed': False, 'cond_name': 'iTK226_070618_001', 'correctCA': False, 'debugging': False, ...}
    749 
    750     # Print log all at once
    751     time_msg = 'Took %s s.\n' % (round(time.time() - start_time, 3))
    752     if not 1 == kwargs['ncores']:

...........................................................................
/home/bicro/Desktop/ggcode/pygpseq/pygpseq/anim/series.py in find_nuclei(self=<pygpseq.anim.series.Series object>, **kwargs={'adaptive_neighbourhood': 101, 'an_type': 3, 'aspect': (300.0, 216.6, 216.6), 'basedir': '/home/bicro/Desktop/user_folders/Tomek/microscop...1_080618/Data_input_iTK226_233/iTK226_070618_001/', 'calc_n_surface': False, 'cdescr': {}, 'compressed': False, 'cond_name': 'iTK226_070618_001', 'correctCA': False, 'debugging': False, ...})
    229 
    230         # Make new channel copy
    231         i = dna_ch.copy()
    232 
    233         # Produce a mask
--> 234         Segmenter = Binarize(path = kwargs['logpath'], append = True, **kwargs)
        Segmenter = undefined
        kwargs = {'adaptive_neighbourhood': 101, 'an_type': 3, 'aspect': (300.0, 216.6, 216.6), 'basedir': '/home/bicro/Desktop/user_folders/Tomek/microscop...1_080618/Data_input_iTK226_233/iTK226_070618_001/', 'calc_n_surface': False, 'cdescr': {}, 'compressed': False, 'cond_name': 'iTK226_070618_001', 'correctCA': False, 'debugging': False, ...}
    235         Segmenter.verbose = self.verbose
    236         
    237         # Check if already segmented
    238         already_segmented = False

...........................................................................
/home/bicro/Desktop/ggcode/pygpseq/pygpseq/tools/binarize.py in __init__(self=<pygpseq.tools.binarize.Binarize object>, **kwargs={'adaptive_neighbourhood': 101, 'an_type': 3, 'append': True, 'aspect': (300.0, 216.6, 216.6), 'basedir': '/home/bicro/Desktop/user_folders/Tomek/microscop...1_080618/Data_input_iTK226_233/iTK226_070618_001/', 'calc_n_surface': False, 'cdescr': {}, 'compressed': False, 'cond_name': 'iTK226_070618_001', 'correctCA': False, ...})
     60         """Initialize binarization settings all at once with kwargs.
     61 
     62         Args:
     63           **kwargs: arbitrary keyword arguments stored in the class.
     64         """
---> 65         super(Binarize, self).__init__()
        self.__init__ = <bound method Binarize.__init__ of <pygpseq.tools.binarize.Binarize object>>
     66 
     67         # Store provided kwargs in the current instance.
     68         excluded = ['logpath']
     69         for k in kwargs.keys():

...........................................................................
/home/bicro/Desktop/ggcode/pygpseq/pygpseq/tools/io.py in __init__(self=<pygpseq.tools.binarize.Binarize object>, **kwargs={})
     64                 fname, fext = os.path.splitext(curpath)
     65                 fname = fname + ' (' + str(c) + ')'
     66                 curpath = fname + fext
     67                 c += 1
     68 
---> 69         self.logpath = curpath
        self.logpath = ''
        curpath = '/tmp/pyGPSeq/log/2018-06-18_20:18:22.log'
     70     
     71     def check_log_dir(self, path = None):
     72         """Create log file directory if missing.
     73 

...........................................................................
/home/bicro/Desktop/ggcode/pygpseq/pygpseq/tools/binarize.py in __setattr__(self=<pygpseq.tools.binarize.Binarize object>, name='logpath', value='/tmp/pyGPSeq/log/2018-06-18_20:18:22.log')
     74         if key in dir(self): return(getattr(self, key))
     75         else: return(None)
     76 
     77     def __setattr__(self, name, value):
     78         """ Check the attribute and set it. """
---> 79         Binarize.check_attr(name, value)
        name = 'logpath'
        value = '/tmp/pyGPSeq/log/2018-06-18_20:18:22.log'
     80         return(super(Binarize, self).__setattr__(name, value))
     81 
     82     def __setitem__(self, key, value):
     83         """ Allow set item. """

TypeError: unbound method check_attr() must be called with Binarize instance as first argument (got str instance instead)
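The TypeError above is a Python 2 artifact: `check_attr` is defined as a plain method, so calling it through the class with a string as its first argument fails under Python 2 ("unbound method ... must be called with ... instance"), whereas Python 3 would accept it as an ordinary function. A sketch of the call pattern and one possible remedy, marking the validator as a `@staticmethod` (an illustration of the failure mode, not necessarily the fix applied in pygpseq):

```python
class Binarize(object):
    @staticmethod                      # without this decorator, Python 2 raises
    def check_attr(name, value):       # the TypeError seen in the traceback
        # placeholder validation; the real checks live in binarize.py
        assert isinstance(name, str)

    def __setattr__(self, name, value):
        # class-level call, exactly as in the traceback
        Binarize.check_attr(name, value)
        return super(Binarize, self).__setattr__(name, value)

b = Binarize()
b.logpath = "/tmp/pyGPSeq/log/example.log"   # hypothetical path for the demo
assert b.logpath == "/tmp/pyGPSeq/log/example.log"
```

Note also that the traceback shows the code running under Python 2.7.15, while `gpseq_anim` declares `#!/usr/bin/env python3`; invoking the pipeline with Python 3 may avoid the unbound-method semantics entirely.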

Additional information

My first step was to check whether I was running the latest version of the pipeline; it turned out to be already up to date. I then re-ran older data sets that had worked in the past, and they failed at the same step, so the issue is not related to the images themselves (I initially suspected there was something fishy about them). So, dear GG, maybe I am doing something silly on my side, but at this point I cannot tell what the problem could be. Thanks! :)
