
gerhobbelt / jison

This project forked from zaach/jison

bison / YACC / LEX in JavaScript (LALR(1), SLR(1), etc. lexer/parser generator)

Home Page: https://gerhobbelt.github.io/jison/

License: MIT License

lexer parser lalr lr lr1 slr slr1 lalr1 flex yacc bison lex parsing tokenizer

jison's Introduction

Jison

Join the chat at https://gitter.im/jison-parsers-lexers/Lobby

Notice

This repository contains a fork maintained by GerHobbelt. The original JISON work was done by Zachary Carter and is available in zaach/jison.

For an overview of all changes (fixes and features), see the section What's New or Different? further below. See also pullreq #338.

An API for creating parsers in JavaScript

Jison generates bottom-up parsers in JavaScript. Its API is similar to Bison's, hence the name. It supports many of Bison's major features, plus some of its own. If you are new to parser generators such as Bison, and Context-free Grammars in general, a good introduction is found in the Bison manual. If you already know Bison, Jison should be easy to pick up.

Briefly, Jison takes a JSON-encoded or Bison-style grammar and outputs a JavaScript file capable of parsing the language described by that grammar. You can then use the generated script to parse inputs and accept, reject, or perform actions based on the input.

Installation

Jison can be installed for Node.js using npm:

npm install jison-gho -g

Usage from the command line

Clone the github repository for examples:

git clone git://github.com/GerHobbelt/jison.git
cd jison/examples

Now you're ready to generate some parsers:

jison calculator.jison

This will generate calculator.js in your current working directory. This file can be used to parse an input file, like so:

echo "2^32 / 1024" > testcalc
node calculator.js testcalc

This will print out 4194304.
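
The generated parser can also be used programmatically. A minimal sketch, assuming the default commonjs module type (which, as shown further below, exports parser and parse):

// use-calculator.js -- illustrative file name
var parser = require('./calculator.js').parser;

console.log(parser.parse('2^32 / 1024'));   // prints 4194304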

Full cli option list:

Usage: jison [file] [lexfile] [options]

file        file containing a grammar
lexfile     file containing a lexical grammar

Where the available options are:

-j, --json  force jison to expect a grammar in JSON format [false]

-o FILE, --outfile FILE  Filepath and base module name of the generated parser; when terminated with a / (dir separator) it is treated as the destination directory where the generated output will be stored

-t, --debug  Debug mode [false]

-I, --info  Report some statistics about the generated parser [false]

-m TYPE, --module-type TYPE  The type of module to generate (commonjs, amd, es, js) or an alias (cjs=commonjs, umd=amd and iffe=js) [commonjs]

-n NAME, --module-name NAME  The name of the generated parser object, namespace supported. This has no effect on amd/umd or es modules.

-p TYPE, --parser-type TYPE  The type of algorithm to use for the parser (lr0, slr, lalr, lr, ll) [lalr]

-c, --compress-tables  Output compressed parser tables in generated modules (0 = no compression, 1 = default compression, 2 = deep compression) [2]

-T, --output-debug-tables  Output extra parser tables (rules list + look-ahead analysis) in generated modules to assist debugging / diagnostics purposes [false]

-X, --no-default-resolve  Act another way when a conflict is found in the grammar [false]

--default-action=[for-values,for-locations]  Generate a parser which does NOT include the default "$$ = $1" action for every rule. This produces a slightly faster parser but now you are solely responsible for propagating rule action "$$" results. [false]

--no-try-catch  Generate a parser which does NOT try/catch exceptions (from the grammar action code or parseError error reporting calls). This produces a slightly faster parser at the cost of enhanced code safety. [false]

-Q, --error-recovery-token-discard-count  Set the number of lexed tokens that may be gobbled by an error recovery process before we cry wolf (default: 3) [3]

-E, --export-all-tables  Next to producing a grammar source file, also export the symbols, terminals, grammar and parse tables to separate JSON files for further use by other tools. The files' names will be derived from the outputFile name by appending a suffix. [false]

-x, --main  Include .main() entry point in generated commonjs module [false]

-y NAME, --module-main NAME  The module exports NAME as exports.main (module type commonjs or cjs) or as yymain (module type es). This option has no effect with module type amd or umd. It only has an effect when used with -x, though it does not (contrary to possible expectations) rename the main function; it simply elides the creation of a default main and exports NAME as the main entry point instead.

-V, --version  print version and exit

Usage as a CommonJS module

You can generate parsers programmatically from JavaScript as well. Assuming Jison is in your CommonJS environment's load path:

// mygenerator.js
var Parser = require("jison").Parser;

// a grammar in JSON
var grammar = {
    "lex": {
        "rules": [
           ["\\s+", "/* skip whitespace */"],
           ["[a-f0-9]+", "return 'HEX';"]
        ]
    },

    "bnf": {
        "hex_strings" :[ "hex_strings HEX",
                         "HEX" ]
    }
};

// `grammar` can also be a string that uses jison's grammar format
var parser = new Parser(grammar);

// generate source, ready to be written to disk
var parserSource = parser.generate();

// you can also use the parser directly from memory

// returns true
parser.parse("adfe34bc e82a");

// throws lexical error
parser.parse("adfe34bc zxg");

Differences in module types

Jison can emit the following module types: commonjs/cjs, amd/umd, es, and js/iffe. In the sections below, <parser> represents the parser code common to all module types.

cjs/commonjs

The parser is wrapped in:

var <module-name> = (function () {
  <parser>
  return new Parser();
})();
if (typeof require !== 'undefined' && typeof exports !== 'undefined') {
  exports.parser = <module-name>;
  exports.Parser = <module-name>.Parser;
  exports.parse = function () {
    return <module-name>.parse.apply(<module-name>, arguments);
  };
}

The --main function is declared with:

exports.main = function (args) {
  ...
}
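
With -x / --main enabled, you can drive that entry point yourself. A rough sketch (the argument shape is an assumption, not something this README specifies):

// sketch: invoke the generated main() directly from another script
var calculator = require('./calculator.js');
calculator.main(process.argv.slice(1));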

amd/umd

The parser is wrapped with:

define(function (require) {
  <parser>
  return parser;
});

The --module-name NAME option has no effect if the type is amd or umd.
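
Loading the generated amd/umd module with an AMD-style loader then looks roughly like this (a sketch; the module id is illustrative):

require(['./calculator'], function (parser) {
  console.log(parser.parse('2^32 / 1024'));
});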

js/iffe

The parser is wrapped with:

var <module-name> = (function () {
  <parser>
  function Parser() {
    this.yy = {};
  }
  Parser.prototype = parser;
  parser.Parser = Parser;

  return new Parser();
})();
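
After loading such a file in a browser (e.g. via a plain script tag), the parser is available as a global variable named after --module-name. A sketch, assuming the module name calculator:

// the generated file declares `var calculator = ...` at global scope
console.log(calculator.parse('2^32 / 1024'));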

es

The parser is appended with:

<parser>
function yyparse() {
    return parser.parse.apply(parser, arguments);
}
export default {
    parser,
    Parser,
    parse: yyparse,
};

The --module-name NAME option has no effect if the type is es. The --main function is declared with:

var yymain = function (args) {
  ...
}
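
Consuming the generated es module then looks roughly like this (a sketch; the file name is illustrative and assumes your environment resolves ES modules):

import calculator from './calculator.mjs';

// the default export bundles parser, Parser and the yyparse wrapper (as `parse`)
console.log(calculator.parse('2^32 / 1024'));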

More Documentation

For more information on creating grammars and using the generated parsers, read the documentation.

How to contribute

See CONTRIBUTING.md for contribution guidelines, how to run the tests, etc.

Projects using Jison

View them on the wiki, or add your own.

Submodules for Jison

The JISON tool uses several modules:

  • The ebnf-parser library parses BNF and EBNF grammars to a basic AST used by Jison to produce a parser engine for your grammar spec.
  • The lex-parser library parses %lex ... /lex lexical grammars to a basic AST used by Jison to produce a parser engine for your grammar spec.
  • The jison-lex library/utility generates lexical analyzers which are included by Jison in your parser run-time engine to lex the input according to your %lex ... /lex lexical grammar definition.
  • The jison2json utility converts a Jison spec file to JSON format file.
  • The json2jison utility converts a JSON format file to a Jison spec file.

Contributors

Githubbers

Special thanks to Jarred Ligatti, Manuel E. Bermúdez

What's New or Different?

Here's a comprehensive list of features and fixes compared to the original:

  • Full Unicode support: the lexer can handle all Unicode regexes which are supported by the XRegExp library, with a few notes:

    • your own software does not need to include the XRegExp library: jison will produce standard JavaScript regex expressions for every lexer rule so that you can enjoy most Unicode features without the added burden of another library (XRegExp)

    • astral Unicode codepoints are not fully supported within regex character set expressions, unless you yourself include XRegExp and instruct the lexer to produce XRegExp regex expressions via the lexer option %options xregexp

  • EBNF LR/LALR/SLR/LR0 grammars are correctly rewritten to BNF grammars, allowing your action code blocks to access all elements of the grammar rule at hand. See also the wiki section about EBNF.

  • Parser engine optimization: jison analyzes not just your grammar, but also your action code and will strip any feature you don't use (such as location tracking via @element references and yylloc) from the parser kernel, which will benefit your parser run-time performance. The fastest parsers are obtained when you do not include error recovery (error tokens in your grammar), nor any lexer location tracking: this can potentially result in run-time execution cost reductions of over 70% (hence your parser executes more than 3 times as fast)!

  • Generated grammar / lexer source files carry full API and internals documentation in the code comments to help you read and debug a grammar. For example, every grammar rule is printed above its action code, so that stepping through the parser when debugging hard-to-find problems makes it quite obvious which rule the engine is currently 'reducing'.

  • Generated parsers and lexers are JavaScript strict mode compliant.

  • you can specify a totally custom lexer in the %lex ... /lex section of your grammar definition file if you like, i.e. you can define and use a lexer which is not regex-ruleset based / generated by jison-lex! This is particularly handy when you want to achieve maximum performance / absolute minimum parse and lexing overhead for your high-performance grammars. (A minimal sketch of such a hand-written lexer is shown right after this feature list.)

  • lexer.reject() et al: the lexer comes with extra APIs to help you write more sophisticated lexers based on the lex/jison mechanism. The this.reject() call in your lexer rule action code will reject the current match and continue down the lexer rule set to find another match. Very handy when you do not use flex mode matching all the time, but want specific, local control over when a lexer regex (a.k.a. lexer rule) actually is a correct match.

  • You can now enter epsilon as a token in your grammar rules, so no more hacks like /* epsilon */ comments for empty rules: you can type any of these:

    • %epsilon,
    • \u0190
    • \u025B
    • \u03B5
    • \u03F5

    (See also https://en.wikipedia.org/wiki/Epsilon#Glyph_variants)

  • %options easy_keyword_rules: see also https://github.com/zaach/jison/wiki/Deviations-From-Flex-Bison#user-content-literal-tokens

  • ... more lexer features ...

    • %options ...

    • kernel ...

  • ... more parser features ...

    • configurable error recovery search depth (default: 3 tokens)

    • augmented error reporting callbacks

    • dedicated parser and lexer Error-derived exception classes so you can use instanceof to help your generic error code discern what type of error has occurred and what info is available next to the text message itself.

    • (Are we faster even when we run with the same feature set as 'vanilla' zaach jison? Probably a little bit, but we haven't measured this thoroughly.)

    • JSON (rather than JISON) grammar files support all JSON5 features, i.e. you can include comments, etc. in your JSON-file based grammar specs!
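
As referenced in the custom-lexer bullet above, here is a minimal sketch of a hand-written lexer handed to a generated parser. It assumes the parser kernel only needs setInput() and lex() plus the yytext / yylineno / yylloc members; the token names are purely illustrative:

// sketch of a hand-written lexer attached to a generated parser
var myLexer = {
    yytext: '',
    yylineno: 0,
    yylloc: { first_line: 1, first_column: 0, last_line: 1, last_column: 0 },

    // called by the parser before parsing starts
    setInput: function (input /*, yy */) {
        this._input = input;
        this._pos = 0;
        return this;
    },

    // called by the parser whenever it needs the next token
    lex: function () {
        // skip whitespace
        while (this._pos < this._input.length && /\s/.test(this._input[this._pos])) {
            this._pos++;
        }
        if (this._pos >= this._input.length) {
            return 'EOF';
        }
        var m = /^[a-f0-9]+/.exec(this._input.slice(this._pos));
        if (m) {
            this._pos += m[0].length;
            this.yytext = m[0];
            return 'HEX';
        }
        throw new Error('lexical error at position ' + this._pos);
    }
};

// parser.lexer = myLexer;   // attach it before calling parser.parse()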

License

Copyright (c) 2009-2016 Zachary Carter

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

jison's People

Contributors

acodercc, balupton, cdibbs, davidgovea, gerhobbelt, hyperfocusaurus, joseanpg, knuton, lovasoa, lukemueller, mathiasrw, matthewkastor, micha, mike-mcgann, nelsonjchen, nightra, nnydjesus, nolanlawson, patriciali, paulftw, petkaantonov, redchair123, robertleeplummerjr, rpl, rubenverborgh, satyr, syrnick, techtonik, toufik-airane, zaach


jison's Issues

The lexer emitted by jison contains non-es5 code.

I took from #14 that jison should be producing es5 code. However, the lexer emitted by jison contains const, which is not a keyword recognized in es5. The code may misleadingly run just fine on browsers/JS engines that declare themselves to be es5 engines, because historically many engines claiming es5 support also accept let and const even though those keywords are not part of es5 proper.

Steps to reproduce:

  1. Install the latest jison. (I used 0.6.1-215).

  2. Grab the basic_lex.jison example from the examples subdirectory.

  3. Run jison basic_lex.jison.

  4. Open the resulting basic_lex.js and search for const. You'll find some in the lexer_prettyPrintRange function.

Creating multiple, independent lexers

I’m using Jison (technically Jison Lex independently) to create a source code preprocessor: the lexer is the preprocessor. Something I need to do is expand macros. The expanded macros can contain more macros that need to be expanded, so I must do this recursively by creating multiple preprocessors. Because Jison Lex generates a lexer as an object (and not, say, a function that can be instantiated with new), I need to create multiple preprocessors in a hacky way in Node.js. First, I delete the lexer from the Node.js module cache, and then I require the lexer file again. (I’m doing this here.) If I didn’t do this, a child preprocessor handling an expanded macro would be the same object as the parent preprocessor that expanded the macro.
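
For reference, the cache-busting hack described above boils down to something like the following sketch (the file name is a placeholder, and it assumes the generated module exports a lexer object):

// force Node.js to hand back a fresh lexer object instead of the cached one
function freshLexer() {
    var resolved = require.resolve('./preprocessor-lexer.js');
    delete require.cache[resolved];
    return require('./preprocessor-lexer.js').lexer;
}

var outer = freshLexer();
var inner = freshLexer();   // independent instance for the expanded macro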

Unfortunately, this Node.js hack no longer seems to work in Atom v1.17. There’s some kind of additional module caching happening that I can’t circumvent.

My question is: is there a way to set up Jison so that multiple, independent lexers can be used? I’ve tried doing this with Jison Lex alone and by using new Parser() in Jison, but everything seems to use the same lexer object.

code inspection: parser kernel doesn't hand over correctly from error recovery phase

Following up on #21, while doing a bit of code inspection on my own work through copy-edit/git-compare/diff via Beyond Compare, I noticed that the parser kernel has a few very subtle bugs in the error recovery parse loop: as the parse loop is duplicated in the error recovery section (so that I can optimize the main parse loop for regular operations and not worry about the special handling that is required for error recovery there), it does not hand over / drop out into the outer parse loop correctly:

  • when parseError() produces a 'parser return value', that one is DESTROYED in the ACCEPT phase of the outer loop.

  • the outer loop can be further optimized when it doesn't have to worry about a still-active 'recovery phase', i.e. recovering === 0 should be a precondition in the outer parse loop.

  • when (edge case) the lexer also is in the habit of producing TERROR tokens (some of my grammars do this), then we will lose their yyval and yylloc! Hence we must differentiate between a TERROR set up as a replacement token in the parser kernel's error recovery section and a TERROR token produced by the lexer: the latter is an error token too, but should only indirectly trigger error recovery by the parser.

    Hint To Self: this means that an error term in a grammar production has an associated value which is either a parser error recovery info object or a lexer-produced yyvalue, depending on whether the lexer's TERROR-or-other token triggered parser error recovery or not! ... Talk about complex internals...

Update readme

Regarding the documentation of this fork: I think the installation command shown in the README should be changed, as it currently reads npm install jison -g but should be npm install jison-gho -g.
I have seen that this fork contains many improvements over the original project, and I am convinced that this fork is the one everyone should install.

I created this issue because at first I didn't know how to install this fork, and it took me some time to find out what the installation command was. I kept installing the original because that was the command that appeared in this fork's documentation.

Thanks.

Generate lexer/parser in different programming language

Output, for example, code in TypeScript. Or maybe even C.

Way to do this as a general solution: output the lexer and parser tables, plus user action code chunks, as an object/JSON file, which can be picked up by a simple postprocessor/template engine, which can then generate said lexer/parser in the user's choice of language.
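
A rough sketch of that idea, assuming the tables were exported with -E / --export-all-tables (the file name, JSON layout and emit function are all illustrative/hypothetical):

var fs = require('fs');

// load the exported parse-table JSON produced next to the generated parser
var tables = JSON.parse(fs.readFileSync('./calculator.tables.json', 'utf8'));

// hand the raw tables to a (hypothetical) code generator for the target language
emitParserInTargetLanguage(tables, { language: 'typescript' });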

The alternative is going back to bison and learning M4...

Global variable leak

Hello.
It seems that the variable stack_pointer used here is not defined and hence leaks onto window.
Regards,
Manuel

Problem when detecting multiple token

I am writing a basic transpiler; it translates certain syntax using an already defined grammar. I still have some trouble detecting repeated tokens, that is, using * to specify the repetition of a token, for example (see SENTENCE*):

FUNCTION
    : DEF ID PAR_OPEN PAR_CLOSE
        SENTENCE*
      END
        { $$ = 'function ' + $2 + '(){' + $5 + '}' };

SENTENCE
    : PRINT
    | VAR_ASSIGN
    |;

The grammar works with this input:

def hello()
    println "dsasd"
end

but it does not work with this input:

def hello()
    a = 3
    println "dsasd"
end

The error thrown is:

Error: Parse error on line 3:
...llo()    a = 3    println "dsasd"end
---------------------^
Expecting 'EOF', '+', 'OR', 'AND', '=', '<>', '-', '*', '/', '>', '<', '>=', '<=','^', 'PAR_CLOSE', '%', 'END', got 'PRINTLN'

Could you tell me what I'm doing wrong?

See the Gist for the complete code.

action code in grammar doesn't expand `@label` ref inside template string with comments

Offending bnf.y grammar excerpt, where @action after the @0 IS NOT expanded into proper JavaScript code like it should be:

handle_action
    : handle prec action
        {
            $$ = [($handle.length ? $handle.join(' ') : '')];
            if ($action) {
                var rv = checkActionBlock($action, @action);
                if (rv) {
                    yyerror(rmCommonWS`
                        production rule action code block does not compile: ${rv}

                          Erroneous area:
                        ${yylexer.prettyPrintRange(@action, @handle)}
                    `);
                }
                $$.push($action);
            }
            if ($prec) {
                if ($handle.length === 0) {
                    yyerror(rmCommonWS`
                        You cannot specify a precedence override for an epsilon (a.k.a. empty) rule!

                          Erroneous area:
                        ${yylexer.prettyPrintRange(@handle, @0 /* @handle is very probably NULL! We need this one for some decent location info! */, @action /* ditto! */)}
                    `);
                }
                $$.push($prec);
            }
            if ($$.length === 1) {
                $$ = $$[0];
            }
        }

Removing the comment immediately following @0 fixes the problem.

This must be a problem with my code rewriter logic in there; it's still regex-based and apparently goes bonkers when you mix comments with ES6 string templates. Grmbl. I was planning to migrate to using recast AST rewriting already, but that will take some effort / time that is in short supply. :-(

"Erroneous area: ..." display is empty on error

... while I expect some minimal source code dump there.

Example of the failure to dump source as part of the error report(s):

node ../dist/cli-cjs-es5.js -o ./output/olmenu-proto2/ --main ./olmenu-proto2.jison

EBNF: ignoring unsupported parser option "%type  <str> filename "
while lexing in "INITIAL" state.

  Erroneous area:


EBNF: ignoring unsupported parser option "%type  <str> label"
while lexing in "INITIAL" state.

  Erroneous area:


EBNF: ignoring unsupported parser option "%type  <str> menu_command "
while lexing in "INITIAL" state.

  Erroneous area:

`yylexer.prettyPrintRange()` API: very long prelude is not shortened with (...continued...) a la error area

Found while quickly looking at the GnuCobol parser.y grammar.

Expected behaviour SHOULD be a '(...continued...)' prelude for such edge cases, a la the second error dump example below.

Notice the extremely long prelude dumping the entire (irrelevant) setup %{...%} code chunk here while reporting on the %token EOF 0 ... error:

            throw err;
            ^

JisonParserError:
declaration list error?

  Erroneous area:
  21: %expect 0
  22:
  23: %defines
  24: %verbose
  25: %error-verbose
  26:
  27: %{
  28: // #include "config.h"
  29: //
  30: // #include <stdlib.h>
  31: // #include <string.h>
  32: //
  33: // #define COB_IN_PARSER 1
  34: // #include "cobc.h"
  35: // #include "tree.h"
  36: //
  37: // #ifndef _STDLIB_H
  38: // #define _STDLIB_H 1
  39: // #endif
  40: //
  41: // #define YYSTYPE   cb_tree
  42: // #define yyerror(x)  cb_error ("%s", x)
  43: //
  44: // #define emit_statement(x) \
  45: // do { \
  46: //   if (!skip_statements) { \
  47: //  CB_ADD_TO_CHAIN (x, current_program->exec_list); \
  48: //   } \
  49: // }  ONCE_COB
  50: //
  51: // #define push_expr(type, node) \
  52: //   current_expr = cb_build_list (cb_int (type), node, current_expr)
  53: //
  54: // /* Statement terminator definitions */
  55: // #define TERM_NONE  0
  56: // #define TERM_ACCEPT  1U
  57: // #define TERM_ADD  2U
  58: // #define TERM_CALL  3U
  59: // #define TERM_COMPUTE  4U
  60: // #define TERM_DELETE  5U
  61: // #define TERM_DISPLAY  6U
  62: // #define TERM_DIVIDE  7U
  63: // #define TERM_EVALUATE  8U
  64: // #define TERM_IF   9U
  65: // #define TERM_MULTIPLY  10U
  66: // #define TERM_PERFORM  11U
  67: // #define TERM_READ  12U
  68: // #define TERM_RECEIVE  13U
  69: // #define TERM_RETURN  14U
  70: // #define TERM_REWRITE  15U
  71: // #define TERM_SEARCH  16U
  72: // #define TERM_START  17U
  73: // #define TERM_STRING  18U
  74: // #define TERM_SUBTRACT  19U
  75: // #define TERM_UNSTRING  20U
  76: // #define TERM_WRITE  21U
  77: // #define TERM_MAX  22U /* Always last entry, used for array size */
  78: //
  79: // #define TERMINATOR_WARNING(x,z) terminator_warning (x, TERM_##z, #z)
  80: // #define TERMINATOR_ERROR(x,z) terminator_error (x, TERM_##z, #z)
  81: // #define TERMINATOR_CLEAR(x,z) terminator_clear (x, TERM_##z)
  82: //
  83: // /* Defines for duplicate checks */
  84: // /* Note - We use <= 16 for common item definitons and */
  85: // /* > 16 for non-common item definitions e.g. REPORT and SCREEN */
  86: // #define SYN_CLAUSE_1  (1U << 0)
  87: // #define SYN_CLAUSE_2  (1U << 1)
  88: // #define SYN_CLAUSE_3  (1U << 2)
  89: // #define SYN_CLAUSE_4  (1U << 3)
  90: // #define SYN_CLAUSE_5  (1U << 4)
  91: // #define SYN_CLAUSE_6  (1U << 5)
  92: // #define SYN_CLAUSE_7  (1U << 6)
  93: // #define SYN_CLAUSE_8  (1U << 7)
  94: // #define SYN_CLAUSE_9  (1U << 8)
  95: // #define SYN_CLAUSE_10  (1U << 9)
  96: // #define SYN_CLAUSE_11  (1U << 10)
  97: // #define SYN_CLAUSE_12  (1U << 11)
  98: // #define SYN_CLAUSE_13  (1U << 12)
  99: // #define SYN_CLAUSE_14  (1U << 13)
 100: // #define SYN_CLAUSE_15  (1U << 14)
 101: // #define SYN_CLAUSE_16  (1U << 15)
 102: // #define SYN_CLAUSE_17  (1U << 16)
 103: // #define SYN_CLAUSE_18  (1U << 17)
 104: // #define SYN_CLAUSE_19  (1U << 18)
 105: // #define SYN_CLAUSE_20  (1U << 19)
 106: // #define SYN_CLAUSE_21  (1U << 20)
 107: // #define SYN_CLAUSE_22  (1U << 21)
 108: // #define SYN_CLAUSE_23  (1U << 22)
 109: // #define SYN_CLAUSE_24  (1U << 23)
 110: // #define SYN_CLAUSE_25  (1U << 24)
 111: // #define SYN_CLAUSE_26  (1U << 25)
 112: // #define SYN_CLAUSE_27  (1U << 26)
 113: // #define SYN_CLAUSE_28  (1U << 27)
 114: // #define SYN_CLAUSE_29  (1U << 28)
 115: // #define SYN_CLAUSE_30  (1U << 29)
 116: // #define SYN_CLAUSE_31  (1U << 30)
 117: // #define SYN_CLAUSE_32  (1U << 31)
 118: //
 119: // #define EVAL_DEPTH  32
 120: // #define PROG_DEPTH  16
 121: //
 122: // /* Global variables */
 123: //
 124: // struct cb_program  *current_program = NULL;
 125: // struct cb_statement  *current_statement = NULL;
 126: // struct cb_label   *current_section = NULL;
 127: // struct cb_label   *current_paragraph = NULL;
 128: // cb_tree    defined_prog_list = NULL;
 129: // int    cb_exp_line = 0;
 130: //
 131: // cb_tree    cobc_printer_node = NULL;
 132: // int    functions_are_all = 0;
 133: // int    non_const_word = 0;
 134: // int    suppress_data_exceptions = 0;
 135: // int    call_line_number;
 136: // unsigned int   cobc_repeat_last_token = 0;
 137: // unsigned int   cobc_in_id = 0;
 138: // unsigned int   cobc_in_procedure = 0;
 139: // unsigned int   cobc_in_repository = 0;
 140: // unsigned int   cobc_force_literal = 0;
 141: // unsigned int   cobc_cs_check = 0;
 142: // unsigned int   cobc_allow_program_name = 0;
 143: //
 144: // /* Local variables */
 145: //
 146: // enum tallying_phrase {
 147: //  NO_PHRASE,
 148: //  FOR_PHRASE,
 149: //  CHARACTERS_PHRASE,
 150: //  ALL_LEADING_TRAILING_PHRASES,
 151: //  VALUE_REGION_PHRASE
 152: // };
 153: //
 154: // static struct cb_statement *main_statement;
 155: //
 156: // static cb_tree   current_expr;
 157: // static struct cb_field  *current_field;
 158: // static struct cb_field  *description_field;
 159: // static struct cb_file  *current_file;
 160: // static struct cb_cd  *current_cd;
 161: // static struct cb_report  *current_report;
 162: // static struct cb_report  *report_instance;
 163: //
 164: // static struct cb_file  *linage_file;
 165: // static cb_tree   next_label_list;
 166: //
 167: // static char   *stack_progid[PROG_DEPTH];
 168: //
 169: // static enum cb_storage  current_storage;
 170: //
 171: // static cb_tree   perform_stack;
 172: // static cb_tree   qualifier;
 173: // static cb_tree   keys_list;
 174: //
 175: // static cb_tree   save_tree;
 176: // static cb_tree   start_tree;
 177: //
 178: // static unsigned int  check_unreached;
 179: // static unsigned int  in_declaratives;
 180: // static unsigned int  in_debugging;
 181: // static unsigned int  current_linage;
 182: // static unsigned int  report_count;
 183: // static unsigned int  first_prog;
 184: // static unsigned int  setup_from_identification;
 185: // static unsigned int  use_global_ind;
 186: // static unsigned int  same_area;
 187: // static unsigned int  inspect_keyword;
 188: // static unsigned int  main_flag_set;
 189: // static int   next_label_id;
 190: // static int   eval_level;
 191: // static int   eval_inc;
 192: // static int   eval_inc2;
 193: // static int   depth;
 194: // static int   first_nested_program;
 195: // static int   call_mode;
 196: // static int   size_mode;
 197: // static cob_flags_t  set_attr_val_on;
 198: // static cob_flags_t  set_attr_val_off;
 199: // static cob_flags_t  check_duplicate;
 200: // static cob_flags_t  check_on_off_duplicate;
 201: // static cob_flags_t  check_pic_duplicate;
 202: // static cob_flags_t  check_line_col_duplicate;
 203: // static unsigned int  skip_statements;
 204: // static unsigned int  start_debug;
 205: // static unsigned int  save_debug;
 206: // static unsigned int  needs_field_debug;
 207: // static unsigned int  needs_debug_item;
 208: // static unsigned int  env_div_seen;
 209: // static cob_flags_t  header_check;
 210: // static unsigned int  call_nothing;
 211: // static enum tallying_phrase previous_tallying_phrase;
 212: // static cb_tree   default_rounded_mode;
 213: //
 214: // static enum cb_display_type display_type;
 215: // static int   is_first_display_item;
 216: // static cb_tree   advancing_value;
 217: // static cb_tree   upon_value;
 218: // static cb_tree   line_column;
 219: //
 220: // static int   term_array[TERM_MAX];
 221: // static cb_tree   eval_check[EVAL_DEPTH][EVAL_DEPTH];
 222: //
 223: // /* Defines for header presence */
 224: //
 225: // #define COBC_HD_ENVIRONMENT_DIVISION (1U << 0)
 226: // #define COBC_HD_CONFIGURATION_SECTION (1U << 1)
 227: // #define COBC_HD_SPECIAL_NAMES  (1U << 2)
 228: // #define COBC_HD_INPUT_OUTPUT_SECTION (1U << 3)
 229: // #define COBC_HD_FILE_CONTROL  (1U << 4)
 230: // #define COBC_HD_I_O_CONTROL  (1U << 5)
 231: // #define COBC_HD_DATA_DIVISION  (1U << 6)
 232: // #define COBC_HD_FILE_SECTION  (1U << 7)
 233: // #define COBC_HD_WORKING_STORAGE_SECTION (1U << 8)
 234: // #define COBC_HD_LOCAL_STORAGE_SECTION (1U << 9)
 235: // #define COBC_HD_LINKAGE_SECTION  (1U << 10)
 236: // #define COBC_HD_COMMUNICATION_SECTION (1U << 11)
 237: // #define COBC_HD_REPORT_SECTION  (1U << 12)
 238: // #define COBC_HD_SCREEN_SECTION  (1U << 13)
 239: // #define COBC_HD_PROCEDURE_DIVISION (1U << 14)
 240: // #define COBC_HD_PROGRAM_ID  (1U << 15)
 241: //
 242: // /* Static functions */
 243: //
 244: // static void
 245: // begin_statement (const char *name, const unsigned int term)
 246: // {
 247: //  if (check_unreached) {
 248: //   cb_warning (cb_warn_unreachable, _("unreachable statement '%s'"), name);
 249: //  }
 250: //  current_paragraph->flag_statement = 1;
 251: //  current_statement = cb_build_statement (name);
 252: //  CB_TREE (current_statement)->source_file = cb_source_file;
 253: //  CB_TREE (current_statement)->source_line = cb_source_line;
 254: //  current_statement->flag_in_debug = in_debugging;
 255: //  emit_statement (CB_TREE (current_statement));
 256: //  if (term) {
 257: //   term_array[term]++;
 258: //  }
 259: //  main_statement = current_statement;
 260: // }
 261: //
 262: // /* create a new statement with base attributes of current_statement
 263: //    and set this as new current_statement */
 264: // static void
 265: // begin_implicit_statement (void)
 266: // {
 267: //  struct cb_statement *new_statement;
 268: //  new_statement = cb_build_statement (NULL);
 269: //  new_statement->common = current_statement->common;
 270: //  new_statement->name = current_statement->name;
 271: //  new_statement->flag_in_debug = !!in_debugging;
 272: //  current_statement = new_statement;
 273: //  main_statement->body = cb_list_add (main_statement->body,
 274: //          CB_TREE (current_statement));
 275: // }
 276: //
 277: // # if 0 /* activate only for debugging purposes for attribs */
 278: // static
 279: // void print_bits (cob_flags_t num)
 280: // {
 281: //  unsigned int  size = sizeof (cob_flags_t);
 282: //  unsigned int max_pow = 1 << (size * 8 - 1);
 283: //  int   i = 0;
 284: //
 285: //  for(; i < size * 8; ++i){
 286: //   /* Print last bit and shift left. */
 287: //   fprintf (stderr, "%u ", num & max_pow ? 1 : 0);
 288: //   num = num << 1;
 289: //  }
 290: //  fprintf (stderr, "\n");
 291: // }
 292: // #endif
 293: //
 294: // static void
 295: // emit_entry (const char *name, const int encode, cb_tree using_list, cb_tree convention)
 296: // {
 297: //  cb_tree  l;
 298: //  cb_tree  label;
 299: //  cb_tree  x;
 300: //  cb_tree  entry_conv;
 301: //  struct cb_field *f, *ret_f;
 302: //  int   param_num;
 303: //  char  buff[COB_MINI_BUFF];
 304: //
 305: //  snprintf (buff, (size_t)COB_MINI_MAX, "E$%s", name);
 306: //  label = cb_build_label (cb_build_reference (buff), NULL);
 307: //  if (encode) {
 308: //   CB_LABEL (label)->name = cb_encode_program_id (name);
 309: //   CB_LABEL (label)->orig_name = name;
 310: //  } else {
 311: //   CB_LABEL (label)->name = name;
 312: //   CB_LABEL (label)->orig_name = current_program->orig_program_id;
 313: //  }
 314: //  CB_LABEL (label)->flag_begin = 1;
 315: //  CB_LABEL (label)->flag_entry = 1;
 316: //  label->source_file = cb_source_file;
 317: //  label->source_line = cb_source_line;
 318: //  emit_statement (label);
 319: //
 320: //  if (current_program->flag_debugging) {
 321: //   emit_statement (cb_build_debug (cb_debug_contents,
 322: //       "START PROGRAM", NULL));
 323: //  }
 324: //
 325: //  param_num = 1;
 326: //  for (l = using_list; l; l = CB_CHAIN (l)) {
 327: //   x = CB_VALUE (l);
 328: //   if (CB_VALID_TREE (x) && cb_ref (x) != cb_error_node) {
 329: //    f = CB_FIELD (cb_ref (x));
 330: //    if (!current_program->flag_chained) {
 331: //     if (f->storage != CB_STORAGE_LINKAGE) {
 332: //      cb_error_x (x, _("'%s' is not in LINKAGE SECTION"), f->name);
 333: //     }
 334: //     if (f->flag_item_based || f->flag_external) {
 335: //      cb_error_x (x, _("'%s' cannot be BASED/EXTERNAL"), f->name);
 336: //     }
 337: //     f->flag_is_pdiv_parm = 1;
 338: //    } else {
 339: //     if (f->storage != CB_STORAGE_WORKING) {
 340: //      cb_error_x (x, _("'%s' is not in WORKING-STORAGE SECTION"), f->name);
 341: //     }
 342: //     f->flag_chained = 1;
 343: //     f->param_num = param_num;
 344: //     param_num++;
 345: //    }
 346: //    if (f->level != 01 && f->level != 77) {
 347: //     cb_error_x (x, _("'%s' not level 01 or 77"), f->name);
 348: //    }
 349: //    if (f->redefines) {
 350: //     cb_error_x (x, _ ("'%s' REDEFINES field not allowed here"), f->name);
 351: //    }
 352: //    /* add a "receiving" entry for the USING parameter */
 353: //    if (cb_listing_xref) {
 354: //     cobc_xref_link (&f->xref, CB_REFERENCE (x)->common.source_line, 1);
 355: //    }
 356: //   }
 357: //  }
 358: //
 359: //
 360: //  if (current_program->returning &&
 361: //   cb_ref (current_program->returning) != cb_error_node) {
 362: //   ret_f = CB_FIELD (cb_ref (current_program->returning));
 363: //   if (ret_f->redefines) {
 364: //    cb_error_x (current_program->returning, _("'%s' REDEFINES field not allowed here"), ret_f->name);
 365: //   }
 366: //  } else {
 367: //   ret_f = NULL;
 368: //  }
 369: //
 370: //  /* Check dangling LINKAGE items */
 371: //  if (cb_warn_linkage) {
 372: //   for (f = current_program->linkage_storage; f; f = f->sister) {
 373: //    if (f == ret_f) {
 374: //     continue;
 375: //    }
 376: //    for (l = using_list; l; l = CB_CHAIN (l)) {
 377: //     x = CB_VALUE (l);
 378: //     if (CB_VALID_TREE (x) && cb_ref (x) != cb_error_node) {
 379: //      if (f == CB_FIELD (cb_ref (x))) {
 380: //       break;
 381: //      }
 382: //     }
 383: //    }
 384: //    if (!l && !f->redefines) {
 385: //     cb_warning (cb_warn_linkage, _("LINKAGE item '%s' is not a PROCEDURE USING parameter"), f->name);
 386: //    }
 387: //   }
 388: //  }
 389: //
 390: //  /* Check returning item against using items when FUNCTION */
 391: //  if (current_program->prog_type == CB_FUNCTION_TYPE && current_program->returning) {
 392: //   for (l = using_list; l; l = CB_CHAIN (l)) {
 393: //    x = CB_VALUE (l);
 394: //    if (CB_VALID_TREE (x) && cb_ref (x) != cb_error_node) {
 395: //     f = CB_FIELD (cb_ref (x));
 396: //     if (ret_f == f) {
 397: //      cb_error_x (x, _("'%s' USING item duplicates RETURNING item"), f->name);
 398: //     }
 399: //    }
 400: //   }
 401: //  }
 402: //
 403: //  for (l = current_program->entry_list; l; l = CB_CHAIN (l)) {
 404: //   if (strcmp ((const char *)name,
 405: //        (const char *)(CB_LABEL(CB_PURPOSE(l))->name)) == 0) {
 406: //    cb_error_x (CB_TREE (current_statement),
 407: //         _("ENTRY '%s' duplicated"), name);
 408: //   }
 409: //  }
 410: //
 411: //  if (convention) {
 412: //   entry_conv = convention;
 413: //  } else {
 414: //   entry_conv = current_program->entry_convention;
 415: //  }
 416: //
 417: //  current_program->entry_list =
 418: //   cb_list_append (current_program->entry_list,
 419: //     CB_BUILD_PAIR (label, CB_BUILD_PAIR(entry_conv, using_list)));
 420: // }
 421: //
 422: // static size_t
 423: // increment_depth (void)
 424: // {
 425: //  if (++depth >= PROG_DEPTH) {
 426: //   cb_error (_("maximum nested program depth exceeded (%d)"),
 427: //      PROG_DEPTH);
 428: //   return 1;
 429: //  }
 430: //  return 0;
 431: // }
 432: //
 433: // static void
 434: // terminator_warning (cb_tree stmt, const unsigned int termid,
 435: //       const char *name)
 436: // {
 437: //  char  terminator[32];
 438: //
 439: //  check_unreached = 0;
 440: //  if (term_array[termid]) {
 441: //   term_array[termid]--;
 442: //  /* LCOV_EXCL_START */
 443: //  } else {
 444: //   cobc_err_msg ("call to '%s' without any open term for %s",
 445: //    "terminator_warning", name);
 446: //   COBC_ABORT ();
 447: //  }
 448: //  /* LCOV_EXCL_END */
 449: //  snprintf (terminator, 32, "END-%s", name);
 450: //  if (is_reserved_word (terminator)) {
 451: //   cb_warning_x (cb_warn_terminator, CB_TREE (current_statement),
 452: //    _("%s statement not terminated by %s"), name, terminator);
 453: //  }
 454: //
 455: //  /* Free tree associated with terminator */
 456: //  if (stmt) {
 457: //   cobc_parse_free (stmt);
 458: //  }
 459: // }
 460: //
 461: // static void
 462: // terminator_error (cb_tree stmt, const unsigned int termid, const char *name)
 463: // {
 464: //  char  terminator[32];
 465: //
 466: //  check_unreached = 0;
 467: //  if (term_array[termid]) {
 468: //   term_array[termid]--;
 469: //  /* LCOV_EXCL_START */
 470: //  } else {
 471: //   cobc_err_msg ("call to '%s' without any open term for %s",
 472: //    "terminator_error", name);
 473: //   COBC_ABORT ();
 474: //  }
 475: //  /* LCOV_EXCL_END */
 476: //  snprintf (terminator, 32, "END-%s", name);
 477: //  if (is_reserved_word (terminator)) {
 478: //   cb_error_x (CB_TREE (current_statement),
 479: //    _("%s statement not terminated by %s"), name, terminator);
 480: //  } else {
 481: //   cb_error_x (CB_TREE (current_statement),
 482: //    _("%s statement not terminated"), name);
 483: //  }
 484: //
 485: //  /* Free tree associated with terminator */
 486: //  if (stmt) {
 487: //   cobc_parse_free (stmt);
 488: //  }
 489: // }
 490: //
 491: // static void
 492: // terminator_clear (cb_tree stmt, const unsigned int termid)
 493: // {
 494: //  struct cb_perform *p;
 495: //  check_unreached = 0;
 496: //  if (term_array[termid]) {
 497: //   term_array[termid]--;
 498: //  /* LCOV_EXCL_START */
 499: //  } else {
 500: //   cobc_err_msg ("call to '%s' without any open term for %s",
 501: //    "terminator_warning", current_statement->name);
 502: //   COBC_ABORT ();
 503: //  }
 504: //  /* LCOV_EXCL_END */
 505: //  if (termid == TERM_PERFORM
 506: //   && perform_stack) {
 507: //   p = CB_PERFORM (CB_VALUE (perform_stack));
 508: //   if (p->perform_type == CB_PERFORM_UNTIL) {
 509: //    cb_terminate_cond ();
 510: //   }
 511: //  }
 512: //  /* Free tree associated with terminator */
 513: //  if (stmt) {
 514: //   cobc_parse_free (stmt);
 515: //  }
 516: // }
 517: //
 518: // static int
 519: // literal_value (cb_tree x)
 520: // {
 521: //  if (x == cb_space) {
 522: //   return ' ';
 523: //  } else if (x == cb_zero) {
 524: //   return '0';
 525: //  } else if (x == cb_quote) {
 526: //   return cb_flag_apostrophe ? '\'' : '"';
 527: //  } else if (x == cb_null) {
 528: //   return 0;
 529: //  } else if (x == cb_low) {
 530: //   return 0;
 531: //  } else if (x == cb_high) {
 532: //   return 255;
 533: //  } else if (CB_TREE_CLASS (x) == CB_CLASS_NUMERIC) {
 534: //   return cb_get_int (x);
 535: //  } else {
 536: //   return CB_LITERAL (x)->data[0];
 537: //  }
 538: // }
 539: //
 540: // static void
 541: // setup_use_file (struct cb_file *fileptr)
 542: // {
 543: //  struct cb_file *newptr;
 544: //
 545: //  if (fileptr->organization == COB_ORG_SORT) {
 546: //   cb_error (_("USE statement invalid for SORT file"));
 547: //  }
 548: //  if (fileptr->flag_global) {
 549: //   newptr = cobc_parse_malloc (sizeof(struct cb_file));
 550: //   *newptr = *fileptr;
 551: //   newptr->handler = current_section;
 552: //   newptr->handler_prog = current_program;
 553: //   if (!use_global_ind) {
 554: //    current_program->local_file_list =
 555: //     cb_list_add (current_program->local_file_list,
 556: //           CB_TREE (newptr));
 557: //   } else {
 558: //    current_program->global_file_list =
 559: //     cb_list_add (current_program->global_file_list,
 560: //           CB_TREE (newptr));
 561: //   }
 562: //  } else {
 563: //   fileptr->handler = current_section;
 564: //  }
 565: // }
 566: //
 567: // static void
 568: // emit_duplicate_clause_message (const char *clause)
 569: // {
 570: //  /* FIXME: replace by a new warning level that is set
 571: //     to warn/error depending on cb_relaxed_syntax_checks */
 572: //  if (cb_relaxed_syntax_checks) {
 573: //   cb_warning (COBC_WARN_FILLER, _("duplicate %s clause"), clause);
 574: //  } else {
 575: //   cb_error (_("duplicate %s clause"), clause);
 576: //  }
 577: // }
 578: //
 579: // static void
 580: // check_repeated (const char *clause, const cob_flags_t bitval, cob_flags_t *already_seen)
 581: // {
 582: //  if (*already_seen & bitval) {
 583: //   emit_duplicate_clause_message (clause);
 584: //  } else {
 585: //   *already_seen |= bitval;
 586: //  }
 587: // }
 588: //
 589: // static void
 590: // setup_occurs (void)
 591: // {
 592: //  check_repeated ("OCCURS", SYN_CLAUSE_7, &check_pic_duplicate);
 593: //  if (current_field->indexes == COB_MAX_SUBSCRIPTS) {
 594: //   cb_error (_ ("maximum OCCURS depth exceeded (%d)"),
 595: //    COB_MAX_SUBSCRIPTS);
 596: //  } else {
 597: //   current_field->indexes++;
 598: //  }
 599: //
 600: //  if (current_field->flag_unbounded) {
 601: //   if (current_field->storage != CB_STORAGE_LINKAGE) {
 602: //    cb_error_x (CB_TREE(current_field), _("'%s' is not in LINKAGE SECTION"),
 603: //     cb_name (CB_TREE(current_field)));
 604: //   }
 605: //  }
 606: //
 607: //  if (current_field->flag_item_based) {
 608: //   cb_error (_ ("%s and %s are mutually exclusive"), "BASED", "OCCURS");
 609: //  } else if (current_field->flag_external) {
 610: //   cb_error (_ ("%s and %s are mutually exclusive"), "EXTERNAL", "OCCURS");
 611: //  }
 612: //  current_field->flag_occurs = 1;
 613: // }
 614: //
 615: // static void
 616: // setup_occurs_min_max (cb_tree occurs_min, cb_tree occurs_max)
 617: // {
 618: //  if (occurs_max) {
 619: //   current_field->occurs_min = cb_get_int (occurs_min);
 620: //   if (occurs_max != cb_int0) {
 621: //    current_field->occurs_max = cb_get_int (occurs_max);
 622: //    if (!current_field->depending) {
 623: //     if (cb_relaxed_syntax_checks) {
 624: //      cb_warning (COBC_WARN_FILLER, _ ("TO phrase without DEPENDING phrase"));
 625: //      cb_warning (COBC_WARN_FILLER, _ ("maximum number of occurences assumed to be exact number"));
 626: //      current_field->occurs_min = 1; /* CHECKME: why using 1 ? */
 627: //     } else {
 628: //      cb_error (_ ("TO phrase without DEPENDING phrase"));
 629: //     }
 630: //    }
 631: //    if (current_field->occurs_max <= current_field->occurs_min) {
 632: //     cb_error (_ ("OCCURS TO must be greater than OCCURS FROM"));
 633: //    }
 634: //   } else {
 635: //    current_field->occurs_max = 0;
 636: //   }
 637: //  } else {
 638: //   current_field->occurs_min = 1; /* CHECKME: why using 1 ? */
 639: //   current_field->occurs_max = cb_get_int (occurs_min);
 640: //   if (current_field->depending) {
 641: //    cb_verify (cb_odo_without_to, _ ("ODO without TO phrase"));
 642: //   }
 643: //  }
 644: // }
 645: //
 646: // static void
 647: // check_relaxed_syntax (const cob_flags_t lev)
 648: // {
 649: //  const char *s;
 650: //
 651: //  switch (lev) {
 652: //  case COBC_HD_ENVIRONMENT_DIVISION:
 653: //   s = "ENVIRONMENT DIVISION";
 654: //   break;
 655: //  case COBC_HD_CONFIGURATION_SECTION:
 656: //   s = "CONFIGURATION SECTION";
 657: //   break;
 658: //  case COBC_HD_SPECIAL_NAMES:
 659: //   s = "SPECIAL-NAMES";
 660: //   break;
 661: //  case COBC_HD_INPUT_OUTPUT_SECTION:
 662: //   s = "INPUT-OUTPUT SECTION";
 663: //   break;
 664: //  case COBC_HD_FILE_CONTROL:
 665: //   s = "FILE-CONTROL";
 666: //   break;
 667: //  case COBC_HD_I_O_CONTROL:
 668: //   s = "I-O-CONTROL";
 669: //   break;
 670: //  case COBC_HD_DATA_DIVISION:
 671: //   s = "DATA DIVISION";
 672: //   break;
 673: //  case COBC_HD_FILE_SECTION:
 674: //   s = "FILE SECTION";
 675: //   break;
 676: //  case COBC_HD_WORKING_STORAGE_SECTION:
 677: //   s = "WORKING-STORAGE SECTION";
 678: //   break;
 679: //  case COBC_HD_LOCAL_STORAGE_SECTION:
 680: //   s = "LOCAL-STORAGE SECTION";
 681: //   break;
 682: //  case COBC_HD_LINKAGE_SECTION:
 683: //   s = "LINKAGE SECTION";
 684: //   break;
 685: //  case COBC_HD_COMMUNICATION_SECTION:
 686: //   s = "COMMUNICATION SECTION";
 687: //   break;
 688: //  case COBC_HD_REPORT_SECTION:
 689: //   s = "REPORT SECTION";
 690: //   break;
 691: //  case COBC_HD_SCREEN_SECTION:
 692: //   s = "SCREEN SECTION";
 693: //   break;
 694: //  case COBC_HD_PROCEDURE_DIVISION:
 695: //   s = "PROCEDURE DIVISION";
 696: //   break;
 697: //  case COBC_HD_PROGRAM_ID:
 698: //   s = "PROGRAM-ID";
 699: //   break;
 700: //  default:
 701: //   s = "Unknown";
 702: //   break;
 703: //  }
 704: //  if (cb_relaxed_syntax_checks) {
 705: //   cb_warning (COBC_WARN_FILLER, _("%s header missing - assumed"), s);
 706: //  } else {
 707: //   cb_error (_("%s header missing"), s);
 708: //  }
 709: // }
 710: //
 711: // /* check if headers are present - return 0 if fine, 1 if missing
 712: //    Lev1 must always be present and is checked
 713: //    Lev2/3/4, if non-zero (forced) may be present
 714: // */
 715: // static int
 716: // check_headers_present (const cob_flags_t lev1, const cob_flags_t lev2,
 717: //          const cob_flags_t lev3, const cob_flags_t lev4)
 718: // {
 719: //  int ret = 0;
 720: //  if (!(header_check & lev1)) {
 721: //   header_check |= lev1;
 722: //   check_relaxed_syntax (lev1);
 723: //   ret = 1;
 724: //  }
 725: //  if (lev2) {
 726: //   if (!(header_check & lev2)) {
 727: //    header_check |= lev2;
 728: //    check_relaxed_syntax (lev2);
 729: //    ret = 1;
 730: //   }
 731: //  }
 732: //  if (lev3) {
 733: //   if (!(header_check & lev3)) {
 734: //    header_check |= lev3;
 735: //    check_relaxed_syntax (lev3);
 736: //    ret = 1;
 737: //   }
 738: //  }
 739: //  if (lev4) {
 740: //   if (!(header_check & lev4)) {
 741: //    header_check |= lev4;
 742: //    check_relaxed_syntax (lev4);
 743: //    ret = 1;
 744: //   }
 745: //  }
 746: //  return ret;
 747: // }
 748: //
 749: // static void
 750: // build_nested_special (const int ndepth)
 751: // {
 752: //  cb_tree  x;
 753: //  cb_tree  y;
 754: //
 755: //  if (!ndepth) {
 756: //   return;
 757: //  }
 758: //
 759: //  /* Inherit special name mnemonics from parent */
 760: //  for (x = current_program->mnemonic_spec_list; x; x = CB_CHAIN (x)) {
 761: //   y = cb_build_reference (cb_name(CB_PURPOSE(x)));
 762: //   if (CB_SYSTEM_NAME_P (CB_VALUE(x))) {
 763: //    cb_define (y, CB_VALUE(x));
 764: //   } else {
 765: //    cb_build_constant (y, CB_VALUE(x));
 766: //   }
 767: //  }
 768: // }
 769: //
 770: // static void
 771: // clear_initial_values (void)
 772: // {
 773: //  perform_stack = NULL;
 774: //  current_statement = NULL;
 775: //  main_statement = NULL;
 776: //  qualifier = NULL;
 777: //  in_declaratives = 0;
 778: //  in_debugging = 0;
 779: //  use_global_ind = 0;
 780: //  check_duplicate = 0;
 781: //  check_pic_duplicate = 0;
 782: //  skip_statements = 0;
 783: //  start_debug = 0;
 784: //  save_debug = 0;
 785: //  needs_field_debug = 0;
 786: //  needs_debug_item = 0;
 787: //  env_div_seen = 0;
 788: //  header_check = 0;
 789: //  next_label_id = 0;
 790: //  current_linage = 0;
 791: //  set_attr_val_on = 0;
 792: //  set_attr_val_off = 0;
 793: //  report_count = 0;
 794: //  current_storage = CB_STORAGE_WORKING;
 795: //  eval_level = 0;
 796: //  eval_inc = 0;
 797: //  eval_inc2 = 0;
 798: //  inspect_keyword = 0;
 799: //  check_unreached = 0;
 800: //  cobc_in_id = 0;
 801: //  cobc_in_procedure = 0;
 802: //  cobc_in_repository = 0;
 803: //  cobc_force_literal = 0;
 804: //  non_const_word = 0;
 805: //  suppress_data_exceptions = 0;
 806: //  same_area = 1;
 807: //  memset ((void *)eval_check, 0, sizeof(eval_check));
 808: //  memset ((void *)term_array, 0, sizeof(term_array));
 809: //  linage_file = NULL;
 810: //  current_file = NULL;
 811: //  current_cd = NULL;
 812: //  current_report = NULL;
 813: //  report_instance = NULL;
 814: //  next_label_list = NULL;
 815: //  default_rounded_mode = cb_int (COB_STORE_ROUND);
 816: // }
 817: //
 818: // /*
 819: //   We must check for redefinitions of program-names and external program names
 820: //   outside of the usual reference/word_list methods as it may have to be done in
 821: //   a case-sensitive way.
 822: // */
 823: // static void
 824: // begin_scope_of_program_name (struct cb_program *program)
 825: // {
 826: //  const char *prog_name = program->program_name;
 827: //  const char *prog_id = program->orig_program_id;
 828: //  const char *elt_name;
 829: //  const char *elt_id;
 830: //  cb_tree  l;
 831: //
 832: //  /* Error if a program with the same name has been defined. */
 833: //  for (l = defined_prog_list; l; l = CB_CHAIN (l)) {
 834: //   elt_name = ((struct cb_program *) CB_VALUE (l))->program_name;
 835: //   elt_id = ((struct cb_program *) CB_VALUE (l))->orig_program_id;
 836: //   if (cb_fold_call && strcasecmp (prog_name, elt_name) == 0) {
 837: //    cb_error_x ((cb_tree) program,
 838: //         _("redefinition of program name '%s'"),
 839: //         elt_name);
 840: //   } else if (strcmp (prog_id, elt_id) == 0) {
 841: //           cb_error_x ((cb_tree) program,
 842: //         _("redefinition of program ID '%s'"),
 843: //         elt_id);
 844: //    return;
 845: //   }
 846: //  }
 847: //
 848: //  /* Otherwise, add the program to the list. */
 849: //  defined_prog_list = cb_list_add (defined_prog_list,
 850: //       (cb_tree) program);
 851: // }
 852: //
 853: // static void
 854: // remove_program_name (struct cb_list *l, struct cb_list *prev)
 855: // {
 856: //  if (prev == NULL) {
 857: //   defined_prog_list = l->chain;
 858: //  } else {
 859: //   prev->chain = l->chain;
 860: //  }
 861: //  cobc_parse_free (l);
 862: // }
 863: //
 864: // /* Remove the program from defined_prog_list, if necessary. */
 865: // static void
 866: // end_scope_of_program_name (struct cb_program *program, const unsigned char type)
 867: // {
 868: //  struct cb_list *prev = NULL;
 869: //  struct cb_list *l = (struct cb_list *) defined_prog_list;
 870: //
 871: //  /* create empty entry if the program has no PROCEDURE DIVISION, error for UDF */
 872: //  if (!program->entry_list) {
 873: //   if (type == CB_FUNCTION_TYPE) {
 874: //    cb_error (_("FUNCTION '%s' has no PROCEDURE DIVISION"), program->program_name);
 875: //   } else {
 876: //    emit_entry (program->program_id, 0, NULL, NULL);
 877: //   }
 878: //  }
 879: //
 880: //  if (program->nested_level == 0) {
 881: //   return;
 882: //  }
 883: //
 884: //  /* Remove any subprograms */
 885: //  l = CB_LIST (defined_prog_list);
 886: //  while (l) {
 887: //   if (CB_PROGRAM (l->value)->nested_level > program->nested_level) {
 888: //    remove_program_name (l, prev);
 889: //   } else {
 890: //    prev = l;
 891: //   }
 892: //   if (prev && prev->chain != NULL) {
 893: //    l = CB_LIST (prev->chain);
 894: //   } else {
 895: //    l = NULL;
 896: //   }
 897: //  }
 898: //
 899: //  /* Remove the specified program, if it is not COMMON */
 900: //  if (!program->flag_common) {
 901: //   l = (struct cb_list *) defined_prog_list;
 902: //   while (l) {
 903: //    if (strcmp (program->orig_program_id,
 904: //         CB_PROGRAM (l->value)->orig_program_id)
 905: //        == 0) {
 906: //     remove_program_name (l, prev);
 907: //     if (prev && prev->chain != NULL) {
 908: //      l = CB_LIST (prev->chain);
 909: //     } else {
 910: //      l = NULL;
 911: //     }
 912: //     break;
 913: //    } else {
 914: //     prev = l;
 915: //     if (l->chain != NULL) {
 916: //      l = CB_LIST (l->chain);
 917: //     } else {
 918: //      l = NULL;
 919: //     }
 920: //    }
 921: //   }
 922: //  }
 923: // }
 924: //
 925: // static void
 926: // setup_program_start (void)
 927: // {
 928: //  if (setup_from_identification) {
 929: //   setup_from_identification = 0;
 930: //   return;
 931: //  }
 932: //  current_section = NULL;
 933: //  current_paragraph = NULL;
 934: //
 935: //  if (depth != 0 && first_nested_program) {
 936: //   check_headers_present (COBC_HD_PROCEDURE_DIVISION, 0, 0, 0);
 937: //  }
 938: //  first_nested_program = 1;
 939: // }
 940: //
 941: // static int
 942: // setup_program (cb_tree id, cb_tree as_literal, const unsigned char type)
 943: // {
 944: //  setup_program_start ();
 945: //
 946: //  if (first_prog) {
 947: //   first_prog = 0;
 948: //  } else {
 949: //   if (!current_program->flag_validated) {
 950: //    current_program->flag_validated = 1;
 951: //    cb_validate_program_body (current_program);
 952: //   }
 953: //
 954: //   clear_initial_values ();
 955: //   current_program = cb_build_program (current_program, depth);
 956: //   build_nested_special (depth);
 957: //   cb_set_intr_when_compiled ();
 958: //   cb_build_registers ();
 959: //  }
 960: //
 961: //  if (CB_LITERAL_P (id)) {
 962: //   stack_progid[depth] = (char *)(CB_LITERAL (id)->data);
 963: //  } else {
 964: //   stack_progid[depth] = (char *)(CB_NAME (id));
 965: //  }
 966: //
 967: //  if (depth != 0 && type == CB_FUNCTION_TYPE) {
 968: //   cb_error (_("functions may not be defined within a program/function"));
 969: //  }
 970: //
 971: //  if (increment_depth ()) {
 972: //   return 1;
 973: //  }
 974: //
 975: //  current_program->program_id = cb_build_program_id (id, as_literal, type == CB_FUNCTION_TYPE);
 976: //  current_program->prog_type = type;
 977: //
 978: //  if (type == CB_PROGRAM_TYPE) {
 979: //   if (!main_flag_set) {
 980: //    main_flag_set = 1;
 981: //    current_program->flag_main = !!cobc_flag_main;
 982: //   }
 983: //  } else { /* CB_FUNCTION_TYPE */
 984: //   current_program->flag_recursive = 1;
 985: //  }
 986: //
 987: //  if (CB_REFERENCE_P (id)) {
 988: //          cb_define (id, CB_TREE (current_program));
 989: //  }
 990: //
 991: //  begin_scope_of_program_name (current_program);
 992: //
 993: //  return 0;
 994: // }
 995: //
 996: // static void
 997: // decrement_depth (const char *name, const unsigned char type)
 998: // {
 999: //  int d;
1000: //
1001: //  if (depth) {
1002: //   depth--;
1003: //  }
1004: //
1005: //  if (!strcmp (stack_progid[depth], name)) {
1006: //   return;
1007: //  }
1008: //
1009: //  if (type == CB_FUNCTION_TYPE) {
1010: //   cb_error (_("END FUNCTION '%s' is different from FUNCTION-ID '%s'"),
1011: //      name, stack_progid[depth]);
1012: //   return;
1013: //  }
1014: //
1015: //  /* Set depth to that of whatever program we just ended, if it exists. */
1016: //  for (d = depth; d >= 0; --d) {
1017: //   if (!strcmp (stack_progid[d], name)) {
1018: //    depth = d;
1019: //    return;
1020: //   }
1021: //  }
1022: //
1023: //  if (depth != d) {
1024: //   cb_error (_("END PROGRAM '%s' is different from PROGRAM-ID '%s'"),
1025: //      name, stack_progid[depth]);
1026: //  }
1027: // }
1028: //
1029: // static void
1030: // clean_up_program (cb_tree name, const unsigned char type)
1031: // {
1032: //  char  *s;
1033: //
1034: //  end_scope_of_program_name (current_program, type);
1035: //
1036: //  if (name) {
1037: //   if (CB_LITERAL_P (name)) {
1038: //    s = (char *)(CB_LITERAL (name)->data);
1039: //   } else {
1040: //    s = (char *)(CB_NAME (name));
1041: //   }
1042: //
1043: //   decrement_depth (s, type);
1044: //  }
1045: //
1046: //  current_section = NULL;
1047: //  current_paragraph = NULL;
1048: //  if (!current_program->flag_validated) {
1049: //   current_program->flag_validated = 1;
1050: //   cb_validate_program_body (current_program);
1051: //  }
1052: // }
1053: //
1054: // static const char *
1055: // get_literal_or_word_name (const cb_tree x)
1056: // {
1057: //  if (CB_LITERAL_P (x)) {
1058: //   return (const char *) CB_LITERAL (x)->data;
1059: //  } else { /* CB_REFERENCE_P (x) */
1060: //   return (const char *) CB_NAME (x);
1061: //  }
1062: // }
1063: //
1064: // /* verify and set picture sign for currency */
1065: // static void
1066: // set_currency_picture_symbol (const cb_tree x)
1067: // {
1068: //  unsigned char *s  = CB_LITERAL (x)->data;
1069: //
1070: //  if (CB_LITERAL (x)->size != 1) {
1071: //   cb_error_x (x, _("PICTURE SYMBOL for CURRENCY must be one character long"));
1072: //   return;
1073: //  }
1074: //  switch (*s) {
1075: //  case '0':
1076: //  case '1':
1077: //  case '2':
1078: //  case '3':
1079: //  case '4':
1080: //  case '5':
1081: //  case '6':
1082: //  case '7':
1083: //  case '8':
1084: //  case '9':
1085: //  case 'A':
1086: //  case 'B':
1087: //  case 'C':
1088: //  case 'D':
1089: //  case 'E':
1090: //  case 'N':
1091: //  case 'P':
1092: //  case 'R':
1093: //  case 'S':
1094: //  case 'V':
1095: //  case 'X':
1096: //  case 'Z':
1097: //  case 'a':
1098: //  case 'b':
1099: //  case 'c':
1100: //  case 'd':
1101: //  case 'e':
1102: //  case 'n':
1103: //  case 'p':
1104: //  case 'r':
1105: //  case 's':
1106: //  case 'v':
1107: //  case 'x':
1108: //  case 'z':
1109: //  case '+':
1110: //  case '-':
1111: //  case ',':
1112: //  case '.':
1113: //  case '*':
1114: //  case '/':
1115: //  case ';':
1116: //  case '(':
1117: //  case ')':
1118: //  case '=':
1119: //  case '\'':
1120: //  case '"':
1121: //  case ' ':
1122: //   cb_error_x (x, _("invalid character '%c' in PICTURE SYMBOL for CURRENCY"), s[0]);
1123: //   return;
1124: //  default:
1125: //   break;
1126: //  }
1127: //  current_program->currency_symbol = s[0];
1128: // }
1129: //
1130: // /* Return 1 if the prototype name is the same as the current function's. */
1131: // static int
1132: // check_prototype_redefines_current_element (const cb_tree prototype_name)
1133: // {
1134: //  const char *name = get_literal_or_word_name (prototype_name);
1135: //
1136: //  if (strcasecmp (name, current_program->program_name) == 0) {
1137: //   cb_warning_x (COBC_WARN_FILLER, prototype_name,
1138: //    _("prototype has same name as current function and will be ignored"));
1139: //   return 1;
1140: //  }
1141: //
1142: //  return 0;
1143: // }
1144: //
1145: // /* Returns 1 if the prototype has been duplicated. */
1146: // static int
1147: // check_for_duplicate_prototype (const cb_tree prototype_name,
1148: //           const cb_tree prototype)
1149: // {
1150: //  cb_tree dup;
1151: //
1152: //  if (CB_WORD_COUNT (prototype_name) > 0) {
1153: //   /* Make sure the duplicate is a prototype */
1154: //   dup = cb_ref (prototype_name);
1155: //   if (!CB_PROTOTYPE_P (dup)) {
1156: //    redefinition_error (prototype_name);
1157: //    return 1;
1158: //   }
1159: //
1160: //   /* Check the duplicate prototypes match */
1161: //   if (strcmp (CB_PROTOTYPE (prototype)->ext_name,
1162: //        CB_PROTOTYPE (dup)->ext_name)
1163: //       || CB_PROTOTYPE (prototype)->type != CB_PROTOTYPE (dup)->type) {
1164: //    cb_error_x (prototype_name,
1165: //         _("duplicate REPOSITORY entries for '%s' do not match"),
1166: //         get_literal_or_word_name (prototype_name));
1167: //   } else {
1168: //    cb_warning_x (COBC_WARN_FILLER, prototype_name,
1169: //           _("duplicate REPOSITORY entry for '%s'"),
1170: //           get_literal_or_word_name (prototype_name));
1171: //   }
1172: //   return 1;
1173: //  }
1174: //
1175: //  return 0;
1176: // }
1177: //
1178: // static void
1179: // setup_prototype (cb_tree prototype_name, cb_tree ext_name,
1180: //     const int type, const int is_current_element)
1181: // {
1182: //  cb_tree prototype;
1183: //  int name_redefinition_allowed;
1184: //
1185: //  if (!is_current_element
1186: //      && check_prototype_redefines_current_element (prototype_name)) {
1187: //   return;
1188: //  }
1189: //
1190: //  prototype = cb_build_prototype (prototype_name, ext_name, type);
1191: //
1192: //  if (!is_current_element
1193: //      && check_for_duplicate_prototype (prototype_name, prototype)) {
1194: //   return;
1195: //  }
1196: //
1197: //  name_redefinition_allowed = type == CB_PROGRAM_TYPE
1198: //   && is_current_element && cb_program_name_redefinition;
1199: //  if (!name_redefinition_allowed) {
1200: //   if (CB_LITERAL_P (prototype_name)) {
1201: //    cb_define (cb_build_reference ((const char *)CB_LITERAL (prototype_name)->data), prototype);
1202: //   } else {
1203: //    cb_define (prototype_name, prototype);
1204: //   }
1205: //
1206: //   if (type == CB_PROGRAM_TYPE) {
1207: //    current_program->program_spec_list =
1208: //     cb_list_add (current_program->program_spec_list, prototype);
1209: //   } else { /* CB_FUNCTION_TYPE */
1210: //    current_program->user_spec_list =
1211: //     cb_list_add (current_program->user_spec_list, prototype);
1212: //   }
1213: //  }
1214: // }
1215: //
1216: // static void
1217: // error_if_invalid_level_for_renames (cb_tree item)
1218: // {
1219: //  int level = CB_FIELD (cb_ref (item))->level;
1220: //
1221: //  if (level == 1 || level == 66 || level == 77) {
1222: //          cb_verify (cb_renames_uncommon_levels,
1223: //       _("RENAMES of 01-, 66- and 77-level items"));
1224: //  } else if (level == 88) {
1225: //   cb_error (_("RENAMES may not reference a level 88"));
1226: //  }
1227: // }
1228: //
1229: // static int
1230: // set_current_field (cb_tree level, cb_tree name)
1231: // {
1232: //  cb_tree x  = cb_build_field_tree (level, name, current_field,
1233: //        current_storage, current_file, 0);
1234: //  cobc_parse_free (level);
1235: //
1236: //  if (CB_INVALID_TREE (x)) {
1237: //          return 1;
1238: //  } else {
1239: //   current_field = CB_FIELD (x);
1240: //   check_pic_duplicate = 0;
1241: //  }
1242: //
1243: //  return 0;
1244: // }
1245: //
1246: // static void
1247: // check_not_both (const cob_flags_t flag1, const cob_flags_t flag2,
1248: //   const char *flag1_name, const char *flag2_name,
1249: //   const cob_flags_t flags, const cob_flags_t flag_to_set)
1250: // {
1251: //  if (flag_to_set == flag1 && (flags & flag2)) {
1252: //   cb_error (_("cannot specify both %s and %s"),
1253: //      flag1_name, flag2_name);
1254: //  } else if (flag_to_set == flag2 && (flags & flag1)) {
1255: //   cb_error (_("cannot specify both %s and %s"),
1256: //      flag1_name, flag2_name);
1257: //
1258: //  }
1259: // }
1260: //
1261: // static COB_INLINE COB_A_INLINE void
1262: // check_not_highlight_and_lowlight (const cob_flags_t flags,
1263: //       const cob_flags_t flag_to_set)
1264: // {
1265: //  check_not_both (COB_SCREEN_HIGHLIGHT, COB_SCREEN_LOWLIGHT,
1266: //    "HIGHLIGHT", "LOWLIGHT", flags, flag_to_set);
1267: // }
1268: //
1269: // static void
1270: // set_screen_attr (const char *clause, const cob_flags_t bitval)
1271: // {
1272: //  if (current_field->screen_flag & bitval) {
1273: //   emit_duplicate_clause_message (clause);
1274: //  } else {
1275: //   current_field->screen_flag |= bitval;
1276: //  }
1277: // }
1278: //
1279: // static void
1280: // emit_conflicting_clause_message (const char *clause, const char *conflicting)
1281: // {
1282: //  if (cb_relaxed_syntax_checks) {
1283: //   cb_warning (COBC_WARN_FILLER, _("cannot specify both %s and %s; %s is ignored"),
1284: //    clause, conflicting, clause);
1285: //  } else {
1286: //   cb_error (_("cannot specify both %s and %s"),
1287: //    clause, conflicting);
1288: //  }
1289: //
1290: // }
1291: //
1292: // static void
1293: // set_attr_with_conflict (const char *clause, const cob_flags_t bitval,
1294: //    const char *confl_clause, const cob_flags_t confl_bit,
1295: //    const int local_check_duplicate, cob_flags_t *flags)
1296: // {
1297: //  if (local_check_duplicate && (*flags & bitval)) {
1298: //   emit_duplicate_clause_message (clause);
1299: //  } else if (*flags & confl_bit) {
1300: //   emit_conflicting_clause_message (clause, confl_clause);
1301: //  } else {
1302: //  *flags |= bitval;
1303: //  }
1304: // }
1305: //
1306: // static COB_INLINE COB_A_INLINE void
1307: // set_screen_attr_with_conflict (const char *clause, const cob_flags_t bitval,
1308: //           const char *confl_clause,
1309: //           const cob_flags_t confl_bit)
1310: // {
1311: //  set_attr_with_conflict (clause, bitval, confl_clause, confl_bit, 1,
1312: //     &current_field->screen_flag);
1313: // }
1314: //
1315: // static COB_INLINE COB_A_INLINE int
1316: // has_dispattr (const cob_flags_t attrib)
1317: // {
1318: //  return current_statement->attr_ptr
1319: //   && current_statement->attr_ptr->dispattrs & attrib;
1320: // }
1321: //
1322: // static void
1323: // attach_attrib_to_cur_stmt (void)
1324: // {
1325: //  if (!current_statement->attr_ptr) {
1326: //   current_statement->attr_ptr =
1327: //    cobc_parse_malloc (sizeof(struct cb_attr_struct));
1328: //  }
1329: // }
1330: //
1331: // static COB_INLINE COB_A_INLINE void
1332: // set_dispattr (const cob_flags_t attrib)
1333: // {
1334: //  attach_attrib_to_cur_stmt ();
1335: //  current_statement->attr_ptr->dispattrs |= attrib;
1336: // }
1337: //
1338: // static COB_INLINE COB_A_INLINE void
1339: // set_dispattr_with_conflict (const char *attrib_name, const cob_flags_t attrib,
1340: //        const char *confl_name,
1341: //        const cob_flags_t confl_attrib)
1342: // {
1343: //  attach_attrib_to_cur_stmt ();
1344: //  set_attr_with_conflict (attrib_name, attrib, confl_name, confl_attrib, 0,
1345: //     &current_statement->attr_ptr->dispattrs);
1346: // }
1347: //
1348: // static void
1349: // bit_set_attr (const cb_tree on_off, const cob_flags_t attr_val)
1350: // {
1351: //  if (on_off == cb_int1) {
1352: //   set_attr_val_on |= attr_val;
1353: //  } else {
1354: //   set_attr_val_off |= attr_val;
1355: //  }
1356: // }
1357: //
1358: // static void
1359: // set_field_attribs (cb_tree fgc, cb_tree bgc, cb_tree scroll,
1360: //      cb_tree timeout, cb_tree prompt, cb_tree size_is)
1361: // {
1362: //  /* [WITH] FOREGROUND-COLOR [IS] */
1363: //  if (fgc) {
1364: //   current_statement->attr_ptr->fgc = fgc;
1365: //  }
1366: //  /* [WITH] BACKGROUND-COLOR [IS] */
1367: //  if (bgc) {
1368: //   current_statement->attr_ptr->bgc = bgc;
1369: //  }
1370: //  /* [WITH] SCROLL UP | DOWN */
1371: //  if (scroll) {
1372: //   current_statement->attr_ptr->scroll = scroll;
1373: //  }
1374: //  /* [WITH] TIME-OUT [AFTER] */
1375: //  if (timeout) {
1376: //   current_statement->attr_ptr->timeout = timeout;
1377: //  }
1378: //  /* [WITH] PROMPT CHARACTER [IS] */
1379: //  if (prompt) {
1380: //   current_statement->attr_ptr->prompt = prompt;
1381: //  }
1382: //  /* [WITH] SIZE [IS] */
1383: //  if (size_is) {
1384: //   current_statement->attr_ptr->size_is = size_is;
1385: //  }
1386: // }
1387: //
1388: // static void
1389: // set_attribs (cb_tree fgc, cb_tree bgc, cb_tree scroll,
1390: //       cb_tree timeout, cb_tree prompt, cb_tree size_is,
1391: //       const cob_flags_t attrib)
1392: // {
1393: //  attach_attrib_to_cur_stmt ();
1394: //  set_field_attribs (fgc, bgc, scroll, timeout, prompt, size_is);
1395: //
1396: //  current_statement->attr_ptr->dispattrs |= attrib;
1397: // }
1398: //
1399: // static void
1400: // set_attribs_with_conflict  (cb_tree fgc, cb_tree bgc, cb_tree scroll,
1401: //        cb_tree timeout, cb_tree prompt, cb_tree size_is,
1402: //        const char *clause_name, const cob_flags_t attrib,
1403: //        const char *confl_name, const cob_flags_t confl_attrib)
1404: // {
1405: //  attach_attrib_to_cur_stmt ();
1406: //  set_field_attribs (fgc, bgc, scroll, timeout, prompt, size_is);
1407: //
1408: //  set_dispattr_with_conflict (clause_name, attrib, confl_name,
1409: //         confl_attrib);
1410: // }
1411: //
1412: // static cob_flags_t
1413: // zero_conflicting_flag (const cob_flags_t screen_flag, cob_flags_t parent_flag,
1414: //     const cob_flags_t flag1, const cob_flags_t flag2)
1415: // {
1416: //  if (screen_flag & flag1) {
1417: //   parent_flag &= ~flag2;
1418: //  } else if (screen_flag & flag2) {
1419: //   parent_flag &= ~flag1;
1420: //  }
1421: //
1422: //  return parent_flag;
1423: // }
1424: //
1425: // static cob_flags_t
1426: // zero_conflicting_flags (const cob_flags_t screen_flag, cob_flags_t parent_flag)
1427: // {
1428: //  parent_flag = zero_conflicting_flag (screen_flag, parent_flag,
1429: //           COB_SCREEN_BLANK_LINE,
1430: //           COB_SCREEN_BLANK_SCREEN);
1431: //  parent_flag = zero_conflicting_flag (screen_flag, parent_flag,
1432: //           COB_SCREEN_ERASE_EOL,
1433: //           COB_SCREEN_ERASE_EOS);
1434: //  parent_flag = zero_conflicting_flag (screen_flag, parent_flag,
1435: //           COB_SCREEN_HIGHLIGHT,
1436: //           COB_SCREEN_LOWLIGHT);
1437: //
1438: //  return parent_flag;
1439: // }
1440: //
1441: // static void
1442: // check_and_set_usage (const enum cb_usage usage)
1443: // {
1444: //  check_repeated ("USAGE", SYN_CLAUSE_5, &check_pic_duplicate);
1445: //  current_field->usage = usage;
1446: // }
1447: //
1448: // static void
1449: // check_preceding_tallying_phrases (const enum tallying_phrase phrase)
1450: // {
1451: //  switch (phrase) {
1452: //  case FOR_PHRASE:
1453: //   if (previous_tallying_phrase == ALL_LEADING_TRAILING_PHRASES) {
1454: //    cb_error (_("FOR phrase cannot immediately follow ALL/LEADING/TRAILING"));
1455: //   } else if (previous_tallying_phrase == FOR_PHRASE) {
1456: //    cb_error (_("missing CHARACTERS/ALL/LEADING/TRAILING phrase after FOR phrase"));
1457: //   }
1458: //   break;
1459: //
1460: //  case CHARACTERS_PHRASE:
1461: //  case ALL_LEADING_TRAILING_PHRASES:
1462: //   if (previous_tallying_phrase == NO_PHRASE) {
1463: //    cb_error (_("missing FOR phrase before CHARACTERS/ALL/LEADING/TRAILING phrase"));
1464: //   } else if (previous_tallying_phrase == CHARACTERS_PHRASE
1465: //       || previous_tallying_phrase == ALL_LEADING_TRAILING_PHRASES) {
1466: //    cb_error (_("missing value between CHARACTERS/ALL/LEADING/TRAILING words"));
1467: //   }
1468: //   break;
1469: //
1470: //  case VALUE_REGION_PHRASE:
1471: //   if (!(previous_tallying_phrase == ALL_LEADING_TRAILING_PHRASES
1472: //         || previous_tallying_phrase == VALUE_REGION_PHRASE)) {
1473: //    cb_error (_("missing ALL/LEADING/TRAILING before value"));
1474: //   }
1475: //   break;
1476: //
1477: //   /* LCOV_EXCL_START */
1478: //  default:
1479: //   /* This should never happen (and therefore doesn't get a translation) */
1480: //   cb_error ("unexpected tallying phrase");
1481: //   COBC_ABORT();
1482: //   /* LCOV_EXCL_END */
1483: //  }
1484: //
1485: //  previous_tallying_phrase = phrase;
1486: // }
1487: //
1488: // static int
1489: // has_relative_pos (struct cb_field const *field)
1490: // {
1491: //  return !!(field->screen_flag
1492: //     & (COB_SCREEN_LINE_PLUS | COB_SCREEN_LINE_MINUS
1493: //        | COB_SCREEN_COLUMN_PLUS | COB_SCREEN_COLUMN_MINUS));
1494: // }
1495: //
1496: // static int
1497: // is_recursive_call (cb_tree target)
1498: // {
1499: //  const char *target_name = "";
1500: //
1501: //  if (CB_LITERAL_P (target)) {
1502: //   target_name = (const char *)(CB_LITERAL(target)->data);
1503: //  } else if (CB_REFERENCE_P (target)
1504: //      && CB_PROTOTYPE_P (cb_ref (target))) {
1505: //   target_name = CB_PROTOTYPE (cb_ref (target))->ext_name;
1506: //  }
1507: //
1508: //  return !strcmp (target_name, current_program->orig_program_id);
1509: // }
1510: //
1511: // static void
1512: // check_not_88_level (cb_tree x)
1513: // {
1514: //  struct cb_field *f;
1515: //
1516: //  if (x == cb_error_node || x->tag != CB_TAG_REFERENCE) {
1517: //   return;
1518: //  }
1519: //
1520: //  f = CB_FIELD (cb_ref (x));
1521: //
1522: //  if (f != (struct cb_field *) cb_error_node && f->level == 88) {
1523: //   cb_error (_("88-level cannot be used here"));
1524: //  }
1525: // }
1526: //
1527: // static int
1528: // is_screen_field (cb_tree x)
1529: // {
1530: //  if (CB_FIELD_P (x)) {
1531: //   return (CB_FIELD (x))->storage == CB_STORAGE_SCREEN;
1532: //  } else if (CB_REFERENCE_P (x)) {
1533: //   return is_screen_field (cb_ref (x));
1534: //  } else {
1535: //   return 0;
1536: //  }
1537: // }
1538: //
1539: // static void
1540: // error_if_no_advancing_in_screen_display (cb_tree advancing)
1541: // {
1542: //  if (advancing != cb_int1) {
1543: //   cb_error (_("cannot specify NO ADVANCING in screen DISPLAY"));
1544: //  }
1545: // }
1546: //
1547: // static cb_tree
1548: // get_default_display_device (void)
1549: // {
1550: //  if (current_program->flag_console_is_crt
1551: //      || cb_console_is_crt) {
1552: //   return cb_null;
1553: //  } else {
1554: //   return cb_int0;
1555: //  }
1556: // }
1557: //
1558: // static COB_INLINE COB_A_INLINE int
1559: // contains_one_screen_field (struct cb_list *x_list)
1560: // {
1561: //  return (cb_tree) x_list != cb_null
1562: //   && cb_list_length ((cb_tree) x_list) == 1
1563: //   && is_screen_field (x_list->value);
1564: // }
1565: //
1566: // static int
1567: // contains_only_screen_fields (struct cb_list *x_list)
1568: // {
1569: //  if ((cb_tree) x_list == cb_null) {
1570: //   return 0;
1571: //  }
1572: //
1573: //  for (; x_list; x_list = (struct cb_list *) x_list->chain) {
1574: //   if (!is_screen_field (x_list->value)) {
1575: //    return 0;
1576: //   }
1577: //  }
1578: //
1579: //  return 1;
1580: // }
1581: //
1582: // static int
1583: // contains_fields_and_screens (struct cb_list *x_list)
1584: // {
1585: //  int field_seen = 0;
1586: //  int screen_seen = 0;
1587: //
1588: //  if ((cb_tree) x_list == cb_null) {
1589: //   return 0;
1590: //  }
1591: //
1592: //  for (; x_list; x_list = (struct cb_list *) x_list->chain) {
1593: //   if (is_screen_field (x_list->value)) {
1594: //    screen_seen = 1;
1595: //   } else {
1596: //    field_seen = 1;
1597: //   }
1598: //  }
1599: //
1600: //  return screen_seen && field_seen;
1601: // }
1602: //
1603: // static enum cb_display_type
1604: // deduce_display_type (cb_tree x_list, cb_tree local_upon_value, cb_tree local_line_column,
1605: //        struct cb_attr_struct * const attr_ptr)
1606: // {
1607: //  int using_default_device_which_is_crt =
1608: //   local_upon_value == NULL && get_default_display_device () == cb_null;
1609: //
1610: //  if (contains_only_screen_fields ((struct cb_list *) x_list)) {
1611: //   if (!contains_one_screen_field ((struct cb_list *) x_list)
1612: //       || attr_ptr) {
1613: //    cb_verify_x (x_list, cb_accept_display_extensions,
1614: //          _("non-standard DISPLAY"));
1615: //   }
1616: //
1617: //   if (local_upon_value != NULL && local_upon_value != cb_null) {
1618: //    cb_error_x (x_list, _("screens may only be displayed on CRT"));
1619: //   }
1620: //
1621: //   return SCREEN_DISPLAY;
1622: //  } else if (contains_fields_and_screens ((struct cb_list *) x_list)) {
1623: //   cb_error_x (x_list, _("cannot mix screens and fields in the same DISPLAY statement"));
1624: //   return MIXED_DISPLAY;
1625: //  } else if (local_line_column || attr_ptr) {
1626: //   if (local_upon_value != NULL && local_upon_value != cb_null) {
1627: //    cb_error_x (x_list, _("screen clauses may only be used for DISPLAY on CRT"));
1628: //   }
1629: //
1630: //   cb_verify_x (x_list, cb_accept_display_extensions,
1631: //         _("non-standard DISPLAY"));
1632: //
1633: //   return FIELD_ON_SCREEN_DISPLAY;
1634: //  } else if (local_upon_value == cb_null || using_default_device_which_is_crt) {
1635: //   /* This is the only format permitted by the standard */
1636: //   return FIELD_ON_SCREEN_DISPLAY;
1637: //  } else if (display_type == FIELD_ON_SCREEN_DISPLAY && local_upon_value == NULL) {
1638: //   /* This is for when fields without clauses follow fields with screen clauses */
1639: //   return FIELD_ON_SCREEN_DISPLAY;
1640: //  } else {
1641: //   return DEVICE_DISPLAY;
1642: //  }
1643: // }
1644: //
1645: // static void
1646: // set_display_type (cb_tree x_list, cb_tree local_upon_value,
1647: //     cb_tree local_line_column, struct cb_attr_struct * const attr_ptr)
1648: // {
1649: //  display_type = deduce_display_type (x_list, local_upon_value, local_line_column, attr_ptr);
1650: // }
1651: //
1652: // static void
1653: // error_if_different_display_type (cb_tree x_list, cb_tree local_upon_value,
1654: //      cb_tree local_line_column, struct cb_attr_struct * const attr_ptr)
1655: // {
1656: //         const enum cb_display_type type =
1657: //   deduce_display_type (x_list, local_upon_value, local_line_column, attr_ptr);
1658: //
1659: //  /* Avoid re-displaying the same error for mixed DISPLAYs */
1660: //  if (type == display_type || display_type == MIXED_DISPLAY) {
1661: //   return;
1662: //  }
1663: //
1664: //  if (type != MIXED_DISPLAY) {
1665: //   if (type == SCREEN_DISPLAY || display_type == SCREEN_DISPLAY) {
1666: //    cb_error_x (x_list, _("cannot mix screens and fields in the same DISPLAY statement"));
1667: //   } else {
1668: //    /*
1669: //      The only other option is that there is a mix of
1670: //      FIELD_ON_SCREEN_DISPLAY and DEVICE_DISPLAY.
1671: //    */
1672: //    cb_error_x (x_list, _("ambiguous DISPLAY; put items to display on device in separate DISPLAY"));
1673: //   }
1674: //  }
1675: //
1676: //  display_type = MIXED_DISPLAY;
1677: // }
1678: //
1679: // static void
1680: // error_if_not_usage_display_or_nonnumeric_lit (cb_tree x)
1681: // {
1682: //  const int is_numeric_literal = CB_NUMERIC_LITERAL_P (x);
1683: //  const int is_field_with_usage_not_display =
1684: //   CB_REFERENCE_P (x) && CB_FIELD (cb_ref (x))
1685: //   && CB_FIELD (cb_ref (x))->usage != CB_USAGE_DISPLAY;
1686: //
1687: //  if (is_numeric_literal) {
1688: //   cb_error_x (x, _("%s is not an alphanumeric literal"), CB_LITERAL (x)->data);
1689: //  } else if (is_field_with_usage_not_display) {
1690: //   cb_error_x (x, _("'%s' is not USAGE DISPLAY"), cb_name (x));
1691: //  }
1692: // }
1693: //
1694: %}
1695:
1696: %token TOKEN_EOF 0 "end of file"
^^^^...................^
1697:
1698: %token ACCEPT

    at Object.parseError (W:\Users\Ger\Projects\sites\library.visyond.gov\80\lib\tooling\jison\dist\cli-cjs.js:20353:15)
    at Object.yyError (W:\Users\Ger\Projects\sites\library.visyond.gov\80\lib\tooling\jison\dist\cli-cjs.js:20509:26)
    at Object.parser__PerformAction (W:\Users\Ger\Projects\sites\library.visyond.gov\80\lib\tooling\jison\dist\cli-cjs.js:17687:14)
    at Object.parse (W:\Users\Ger\Projects\sites\library.visyond.gov\80\lib\tooling\jison\dist\cli-cjs.js:21537:52)
    at Object.parse (W:\Users\Ger\Projects\sites\library.visyond.gov\80\lib\tooling\jison\dist\cli-cjs.js:24377:23)
    at autodetectAndConvertToJSONformat (W:\Users\Ger\Projects\sites\library.visyond.gov\80\lib\tooling\jison\dist\cli-cjs.js:25559:32)
    at new Jison_Generator (W:\Users\Ger\Projects\sites\library.visyond.gov\80\lib\tooling\jison\dist\cli-cjs.js:33713:15)
    at Object.generateParserString (W:\Users\Ger\Projects\sites\library.visyond.gov\80\lib\tooling\jison\dist\cli-cjs.js:34263:25)
    at processInputFile (W:\Users\Ger\Projects\sites\library.visyond.gov\80\lib\tooling\jison\dist\cli-cjs.js:34174:30)
    at Object.cliMain [as main] (W:\Users\Ger\Projects\sites\library.visyond.gov\80\lib\tooling\jison\dist\cli-cjs.js:34252:13)

And here's a (...continued...) example for reference:

                throw err;
                ^

JisonParserError:
production rule action code block does not compile: Line 634: Unexpected token ILLEGAL

  Erroneous area:
613:     : handle prec action
614:         {
^^^..........^^
615:             $$ = [($handle.length ? $handle.join(' ') : '')];
^^^..^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
     (...continued...)
---  (---------------)
641:             }
^^^..^^^^^^^^^^^^^^
642:         }
^^^..^^^^^^^^^
643:     | EPSILON action
644:         // %epsilon may only be used to signal this is an empty rule alt;

    at Object.parseError (W:\Users\Ger\Projects\sites\library.visyond.gov\80\lib\tooling\jison\dist\cli-cjs-es5.js:14408:19)
    at Object.yyError (W:\Users\Ger\Projects\sites\library.visyond.gov\80\lib\tooling\jison\dist\cli-cjs-es5.js:14547:30)
    at Object.parser__PerformAction (W:\Users\Ger\Projects\sites\library.visyond.gov\80\lib\tooling\jison\dist\cli-cjs-es5.js:14057:34)
    at Object.parse (W:\Users\Ger\Projects\sites\library.visyond.gov\80\lib\tooling\jison\dist\cli-cjs-es5.js:15518:48)
    at Object.parse (W:\Users\Ger\Projects\sites\library.visyond.gov\80\lib\tooling\jison\dist\cli-cjs-es5.js:17891:23)
    at autodetectAndConvertToJSONformat (W:\Users\Ger\Projects\sites\library.visyond.gov\80\lib\tooling\jison\dist\cli-cjs-es5.js:18965:36)
    at new Jison_Generator (W:\Users\Ger\Projects\sites\library.visyond.gov\80\lib\tooling\jison\dist\cli-cjs-es5.js:24550:15)
    at Object.generateParserString (W:\Users\Ger\Projects\sites\library.visyond.gov\80\lib\tooling\jison\dist\cli-cjs-es5.js:25066:25)
    at processInputFile (W:\Users\Ger\Projects\sites\library.visyond.gov\80\lib\tooling\jison\dist\cli-cjs-es5.js:24977:30)
    at Object.cliMain [as main] (W:\Users\Ger\Projects\sites\library.visyond.gov\80\lib\tooling\jison\dist\cli-cjs-es5.js:25055:13)
make[1]: *** [Makefile:33: build] Error 1

Change in error handling between 0.4.18-153 and 0.4.18-154

It looks like there was a change in error handling between 0.4.18-153 and 0.4.18-154. I’m not sure whether I’m doing something wrong, or whether this is unexpected and I should try to reduce this to a minimal test case.

I use Jison in linter-csound. On macOS, if I enter the following in Terminal

git clone https://github.com/nwhetsell/linter-csound.git
cd linter-csound/lib/csound-parser
npm install https://github.com/GerHobbelt/jison/archive/0.4.18-153.tar.gz
../../node_modules/jison-lex/cli.js preprocessor.jison-lex --outfile preprocessor.js
../../node_modules/jison/lib/cli.js orchestra.jison orchestra.jison-lex --outfile orchestra-parser.js
npm install csound-api
npm --global install jasmine
jasmine

to generate parsers and run some tests with Jison 0.4.18-153, all the tests pass. (If you want to run this, note that the csound-api package requires Boost and Csound. Here are installation instructions.)

If I then run

npm uninstall jison
npm install https://github.com/GerHobbelt/jison/archive/0.4.18-154.tar.gz
../../node_modules/jison-lex/cli.js preprocessor.jison-lex --outfile preprocessor.js
../../node_modules/jison/lib/cli.js orchestra.jison orchestra.jison-lex --outfile orchestra-parser.js
jasmine

to generate parsers and run some tests with Jison 0.4.18-154, I get one failure.

The failure appears to have something to do with using this grammar rule…

then_statement
  : THEN NEWLINE statements
    {
      $$ = new Then(@$, {children: $3});
    }
  | THEN NEWLINE
    {
      $$ = new Then(@$);
    }
  | THEN error
    {
      $$ = new Then(@$);
      parser.messages.push({
        type: 'Error',
        text: 'Expected newline',
        range: parser.lexer.rangeFromPosition(@1.last_line, @1.last_column)
      });
    }
  ;

…to parse this snippet:

      if 1 == 1 then + -
      endif

In Jison 0.4.18-154, it seems like an extra exception is thrown due to the error token in this grammar rule:

primary_expression
  : identifier
  | global_value_identifier
  | constant
  | '(' conditional_expression ')'
    {
      $$ = $2;
    }
  | error
    {
      parser.parseError('', {}, {
        type: 'Error',
        text: 'Expected expression',
        range: parser.lexer.rangeFromPosition(@$.first_line, @$.first_column)
      });
    }
  ;

Any ideas?

Cannot find module 'xregexp'

Since yesterday I have been getting the error message "Cannot find module 'xregexp'" when using the parser generator.
A bit of investigation showed that the xregexp dependency points to the master branch of a forked xregexp library, which has changed within the last few hours.

I think we have two issues here:

  1. The reported problem with xregexp.
  2. In general it is bad practice to point a dependency at a moving target such as a master branch. In our project we pin a specific version (0.4.18-180) of jison-gho to make sure that things don't break overnight; nevertheless, that is exactly what happened yesterday, because your sub-dependencies changed.
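For illustration, a minimal package.json sketch of the difference (the xregexp entry here is a hypothetical branch reference, not the actual dependency spec jison-gho ships with):

{
  "dependencies": {
    "jison-gho": "0.4.18-180",
    "xregexp": "github:GerHobbelt/xregexp#master"
  }
}

The pinned entry resolves to the same published tarball on every install; the branch reference is re-resolved against whatever the branch currently points at, so a push to that fork silently changes what npm installs.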

Fix issues which make jison / jison-lex / ... tests fail when run under istanbul/nyc code coverage

Code instrumentation for coverage analysis causes the generated parsers/lexers/misc code to fail; hence the mocha test suites fail dramatically when run under istanbul/nyc code coverage.

Guess: this is probably related to #7, as it is yet another code-rewriting issue within the jison tool itself. The trouble is very probably due to code being stringified, then edited, then fed back into eval() / new Function() / similar 'live' code generation constructs.
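As a rough, hypothetical sketch of that failure mode (cov_demo is an invented name; this is not jison's or istanbul's actual code): coverage instrumentation rewrites statements so they call a counter helper that lives in module scope, and a function that has been stringified and rebuilt with new Function() can no longer see that helper:

// Simulated "instrumented" action: the counter helper lives in module scope.
const counters = { s: [0] };
function cov_demo() { return counters; }
function action() { cov_demo().s[0]++; return 42; }

action();   // fine: cov_demo is reachable from here

// What a stringify-edit-re-eval round trip effectively does:
const rebuilt = new Function('return (' + action.toString() + ')();');
try {
  rebuilt();
} catch (e) {
  // Under Node, module scope is not global scope, so the rebuilt copy
  // cannot reach cov_demo any more:
  console.log(e.name + ': ' + e.message);   // ReferenceError: cov_demo is not defined
}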

Use https://github.com/gotwarlost/istanbul/blob/master/ignoring-code-for-coverage.md as a hint list too: the jison core should be fixed at the text level ('text-rather-than-code'), while unit tests that inject action code as JavaScript functions, etc. can be nudged into cooperation by using the /* istanbul ignore ... */ markers documented at that URL.
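For reference, a minimal sketch of how such a marker is typically applied in test helper code (buildActionForTest is an invented name):

// Exclude this test-only helper from coverage accounting so the coverage
// tooling does not get in the way when its result is injected into jison:
/* istanbul ignore next */
function buildActionForTest() {
  return function (yytext) {
    return yytext.trim();
  };
}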

Question: jison-gho debugger

Hi! I wonder if there's a public URL for the jison debugger, but built with jison-gho?

I noticed there's a forked jison debugger repo that uses jison-gho, but I've spent a few hours
trying to build it without success. I managed to build it, but when running it I get:
ReferenceError: bnf is not defined
    at eval (eval at compileGrammar (grammar-worker.js:14), :1:1)

Thanks!

examples/test-ll2-grammar-1.jison appears to have syntax errors

Attempting to create a JavaScript file from examples/test-ll2-grammar-1.jison using Jison 0.4.18-180 results in:

/…/jison-gho/lib/jison.js:192
            throw err;
            ^

JisonParserError: Parse error on line 55: 
S → G;
--^
Expecting ":", got unexpected "ARROW_ACTION"
    at Parser.parseError (/…/jison-gho/lib/util/parser.js:2464:15)
    at Parser.parse (/…/jison-gho/lib/util/parser.js:2800:34)
    at Object.parse (/…/jison-gho/lib/util/ebnf-parser.js:6:16)
    at autodetectAndConvertToJSONformat (/…/jison-gho/lib/jison.js:182:32)
    at new Jison_Generator (/…/jison-gho/lib/jison.js:6605:15)
    at Object.generateParserString (/…/jison-gho/lib/cli.js:325:21)
    at processInputFile (/…/jison-gho/lib/cli.js:262:26)
    at Object.cliMain [as main] (/…/jison-gho/lib/cli.js:314:9)
    at Object.<anonymous> (/…/jison-gho/lib/cli.js:340:9)
    at Module._compile (module.js:571:32)

(I made the file paths shorter for clarity’s sake.)
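For context, the parser is asking for jison's colon-based production syntax. A minimal, hypothetical grammar in that shape (purely illustrative, not a fix for the actual example file) looks like this:

%lex
%%
\s+       /* skip whitespace */
"g"       return 'g';
<<EOF>>   return 'EOF';
/lex

%%

S
    : G EOF
        { $$ = $1; }
    ;

G
    : 'g'
        { $$ = yytext; }
    ;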

Compile/test ES6/TypeScript/??? action code in jison grammars?

Sideways related to #14: the challenge here is setting up jison so that its internal code generation, code validation and 'live parser' generation logic can cope with (i.e. precompile) sources which are not 'vanilla JS', that is, action code which is not JavaScript supported out of the box on the platform you are currently running jison on.
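Purely as an illustration of the problem (an invented rule, not an officially supported feature), an action block that uses post-ES5 syntax would have to be precompiled before jison can validate it or build a 'live' parser from it on an older platform:

%%

pair
    : KEY ':' VALUE
        {
            /* ES2015+ syntax (const, destructuring, arrow function,
               template literal): fine on modern Node, but a platform
               without ES2015 support cannot eval() this action as-is. */
            const [k, v] = [$1, $3];
            $$ = { key: k, value: v, render: () => `${k}=${v}` };
        }
    ;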

migrate towards using a monorepo: ES6/rollup/etc. is a horror otherwise

See also: https://github.com/GerHobbelt/jison/projects/3

Now that I am upgrading jison et al. to produce ES6/ES2015-compatible output and to be ES6/ES2015 itself (at least as far as modules' import/export is concerned), the split into submodules proves to be a real horror, as it is NOT a dependency tree but rather a (cyclic!) dependency graph at development time.

In short: 💀 💢 😭 👿 👎 👎 👎

jison-gho on NPM contains unrelated files

Ohai,

jison-gho (0.6.1-215) on NPM contains (IMHO) unrelated files, which account for about 21 MB of data.
They are all in the tmp2 folder:

tmp2
└── sandbox
    β”œβ”€β”€ Coding the matrix
    β”‚   β”œβ”€β”€ l01_fields
    β”‚   β”‚   β”œβ”€β”€ 01 - fields.ipynb
    β”‚   β”‚   β”œβ”€β”€ __init__.py
    β”‚   β”‚   β”œβ”€β”€ inverse_index_lab.py
    β”‚   β”‚   β”œβ”€β”€ profile.txt
    β”‚   β”‚   β”œβ”€β”€ python_lab.py
    β”‚   β”‚   β”œβ”€β”€ stories_big.txt
    β”‚   β”‚   β”œβ”€β”€ stories_small.txt
    β”‚   β”‚   β”œβ”€β”€ submit.py
    β”‚   β”‚   β”œβ”€β”€ The_Field_problems.py
    β”‚   β”‚   └── The_Function_problems.py
    β”‚   β”œβ”€β”€ l02_vectors
    β”‚   β”‚   β”œβ”€β”€ 02 - vectors.ipynb
    β”‚   β”‚   β”œβ”€β”€ GF2.py
    β”‚   β”‚   β”œβ”€β”€ __init__.py
    β”‚   β”‚   β”œβ”€β”€ politics_lab.py
    β”‚   β”‚   β”œβ”€β”€ profile.txt
    β”‚   β”‚   β”œβ”€β”€ submit.py
    β”‚   β”‚   β”œβ”€β”€ test_vec.py
    β”‚   β”‚   β”œβ”€β”€ The_Vector_problems.py
    β”‚   β”‚   β”œβ”€β”€ The_Vector_Space_problems.py
    β”‚   β”‚   β”œβ”€β”€ UN_voting_data.txt
    β”‚   β”‚   β”œβ”€β”€ US_Senate_voting_data_109.txt
    β”‚   β”‚   β”œβ”€β”€ vec.py
    β”‚   β”‚   β”œβ”€β”€ vecutil.py
    β”‚   β”‚   └── voting_record_dump109.txt
    β”‚   β”œβ”€β”€ l03_matrix
    β”‚   β”‚   β”œβ”€β”€ bitutil.py
    β”‚   β”‚   β”œβ”€β”€ cit.png
    β”‚   β”‚   β”œβ”€β”€ ecc_lab.py
    β”‚   β”‚   β”œβ”€β”€ geometry_lab.py
    β”‚   β”‚   β”œβ”€β”€ __init__.py
    β”‚   β”‚   β”œβ”€β”€ l03_matrix.ipynb
    β”‚   β”‚   β”œβ”€β”€ mat.py
    β”‚   β”‚   β”œβ”€β”€ mat_sparsity.py
    β”‚   β”‚   β”œβ”€β”€ profile.txt
    β”‚   β”‚   β”œβ”€β”€ submit.py
    β”‚   β”‚   β”œβ”€β”€ The_Matrix_problems.py
    β”‚   β”‚   └── UN_voting_data.txt
    β”‚   β”œβ”€β”€ l04_basis
    β”‚   β”‚   β”œβ”€β”€ board.png
    β”‚   β”‚   β”œβ”€β”€ cit.png
    β”‚   β”‚   β”œβ”€β”€ __init__.py
    β”‚   β”‚   β”œβ”€β”€ l04_basis.ipynb
    β”‚   β”‚   β”œβ”€β”€ perspective_lab.py
    β”‚   β”‚   β”œβ”€β”€ profile.txt
    β”‚   β”‚   β”œβ”€β”€ submit.py
    β”‚   β”‚   └── The_Basis_problems.py
    β”‚   β”œβ”€β”€ l05_dimension
    β”‚   β”‚   β”œβ”€β”€ Dimension_problems.py
    β”‚   β”‚   β”œβ”€β”€ independence.py
    β”‚   β”‚   β”œβ”€β”€ l05_dimension.ipynb
    β”‚   β”‚   └── triangular.py
    β”‚   └── matlib
    β”‚       β”œβ”€β”€ basutil.py
    β”‚       β”œβ”€β”€ bitutil.py
    β”‚       β”œβ”€β”€ dictutil.py
    β”‚       β”œβ”€β”€ GF2.py
    β”‚       β”œβ”€β”€ image_mat_util.py
    β”‚       β”œβ”€β”€ image.py
    β”‚       β”œβ”€β”€ img01.png
    β”‚       β”œβ”€β”€ __init__.py
    β”‚       β”œβ”€β”€ mat.py
    β”‚       β”œβ”€β”€ matutil.py
    β”‚       β”œβ”€β”€ plotting.py
    β”‚       β”œβ”€β”€ png.py
    β”‚       β”œβ”€β”€ solver.py
    β”‚       β”œβ”€β”€ vec.py
    β”‚       └── vecutil.py
    β”œβ”€β”€ HTML
    β”‚   β”œβ”€β”€ [Habr] Single-page layout 1 (simple)
    β”‚   β”‚   β”œβ”€β”€ apple-touch-icon.png
    β”‚   β”‚   β”œβ”€β”€ css
    β”‚   β”‚   β”‚   β”œβ”€β”€ main.css
    β”‚   β”‚   β”‚   β”œβ”€β”€ normalize.css
    β”‚   β”‚   β”‚   └── styles.css
    β”‚   β”‚   β”œβ”€β”€ doc
    β”‚   β”‚   β”‚   β”œβ”€β”€ css.md
    β”‚   β”‚   β”‚   β”œβ”€β”€ extend.md
    β”‚   β”‚   β”‚   β”œβ”€β”€ faq.md
    β”‚   β”‚   β”‚   β”œβ”€β”€ html.md
    β”‚   β”‚   β”‚   β”œβ”€β”€ js.md
    β”‚   β”‚   β”‚   β”œβ”€β”€ misc.md
    β”‚   β”‚   β”‚   β”œβ”€β”€ TOC.md
    β”‚   β”‚   β”‚   └── usage.md
    β”‚   β”‚   β”œβ”€β”€ favicon.ico
    β”‚   β”‚   β”œβ”€β”€ habr-html-1.sublime-project
    β”‚   β”‚   β”œβ”€β”€ habr-html-1.sublime-workspace
    β”‚   β”‚   β”œβ”€β”€ img
    β”‚   β”‚   β”‚   β”œβ”€β”€ bg.png
    β”‚   β”‚   β”‚   β”œβ”€β”€ footer-logo.png
    β”‚   β”‚   β”‚   β”œβ”€β”€ h1-bg.png
    β”‚   β”‚   β”‚   β”œβ”€β”€ logo.png
    β”‚   β”‚   β”‚   β”œβ”€β”€ sample.png
    β”‚   β”‚   β”‚   β”œβ”€β”€ social.png
    β”‚   β”‚   β”‚   └── social-small.png
    β”‚   β”‚   β”œβ”€β”€ index.html
    β”‚   β”‚   β”œβ”€β”€ js
    β”‚   β”‚   β”‚   β”œβ”€β”€ main.js
    β”‚   β”‚   β”‚   β”œβ”€β”€ plugins.js
    β”‚   β”‚   β”‚   └── vendor
    β”‚   β”‚   β”‚       β”œβ”€β”€ jquery-1.12.0.min.js
    β”‚   β”‚   β”‚       └── modernizr-2.8.3.min.js
    β”‚   β”‚   └── README.md
    β”‚   β”œβ”€β”€ [Habr] Single-page layout 2 (bootstrap)
    β”‚   β”‚   β”œβ”€β”€ bower.json
    β”‚   β”‚   β”œβ”€β”€ gruntfile.js
    β”‚   β”‚   β”œβ”€β”€ less
    β”‚   β”‚   β”‚   β”œβ”€β”€ styles.less
    β”‚   β”‚   β”‚   └── variables.less
    β”‚   β”‚   β”œβ”€β”€ package.json
    β”‚   β”‚   β”œβ”€β”€ README.md
    β”‚   β”‚   β”œβ”€β”€ test.sublime-project
    β”‚   β”‚   β”œβ”€β”€ test.sublime-workspace
    β”‚   β”‚   └── www
    β”‚   β”‚       β”œβ”€β”€ css
    β”‚   β”‚       β”‚   β”œβ”€β”€ bootstrap.css
    β”‚   β”‚       β”‚   β”œβ”€β”€ main.css
    β”‚   β”‚       β”‚   └── normalize.css
    β”‚   β”‚       β”œβ”€β”€ fonts
    β”‚   β”‚       β”‚   β”œβ”€β”€ glyphicons-halflings-regular.eot
    β”‚   β”‚       β”‚   β”œβ”€β”€ glyphicons-halflings-regular.svg
    β”‚   β”‚       β”‚   β”œβ”€β”€ glyphicons-halflings-regular.ttf
    β”‚   β”‚       β”‚   β”œβ”€β”€ glyphicons-halflings-regular.woff
    β”‚   β”‚       β”‚   └── glyphicons-halflings-regular.woff2
    β”‚   β”‚       β”œβ”€β”€ img
    β”‚   β”‚       β”‚   β”œβ”€β”€ about-1.png
    β”‚   β”‚       β”‚   β”œβ”€β”€ about-2.png
    β”‚   β”‚       β”‚   β”œβ”€β”€ bg.png
    β”‚   β”‚       β”‚   β”œβ”€β”€ footer-logo.png
    β”‚   β”‚       β”‚   β”œβ”€β”€ h1-bg.png
    β”‚   β”‚       β”‚   β”œβ”€β”€ logo.png
    β”‚   β”‚       β”‚   β”œβ”€β”€ map.png
    β”‚   β”‚       β”‚   β”œβ”€β”€ sample.png
    β”‚   β”‚       β”‚   β”œβ”€β”€ social.png
    β”‚   β”‚       β”‚   β”œβ”€β”€ social-small.png
    β”‚   β”‚       β”‚   β”œβ”€β”€ source
    β”‚   β”‚       β”‚   β”‚   β”œβ”€β”€ About.psd
    β”‚   β”‚       β”‚   β”‚   β”œβ”€β”€ Homepage.psd
    β”‚   β”‚       β”‚   β”‚   β”œβ”€β”€ Projects.psd
    β”‚   β”‚       β”‚   β”‚   └── Read Me.txt
    β”‚   β”‚       β”‚   └── team
    β”‚   β”‚       β”‚       β”œβ”€β”€ Brunton.jpg
    β”‚   β”‚       β”‚       β”œβ”€β”€ Doe.jpg
    β”‚   β”‚       β”‚       β”œβ”€β”€ Nobriga.jpg
    β”‚   β”‚       β”‚       β”œβ”€β”€ Pittsley.jpg
    β”‚   β”‚       β”‚       β”œβ”€β”€ Rousselle.jpg
    β”‚   β”‚       β”‚       β”œβ”€β”€ Shoff.jpg
    β”‚   β”‚       β”‚       β”œβ”€β”€ Simser.jpg
    β”‚   β”‚       β”‚       β”œβ”€β”€ Tondrea.jpg
    β”‚   β”‚       β”‚       β”œβ”€β”€ Venuti.jpg
    β”‚   β”‚       β”‚       └── Wollman.jpg
    β”‚   β”‚       β”œβ”€β”€ index.html
    β”‚   β”‚       └── js
    β”‚   β”‚           β”œβ”€β”€ main.js
    β”‚   β”‚           β”œβ”€β”€ plugins.js
    β”‚   β”‚           └── vendor
    β”‚   β”‚               β”œβ”€β”€ bootstrap.min.js
    β”‚   β”‚               β”œβ”€β”€ jquery-1.12.0.min.js
    β”‚   β”‚               └── modernizr-2.8.3.min.js
    β”‚   └── [pluralsight] Bootstrap
    β”‚       β”œβ”€β”€ bootstrap.sublime-project
    β”‚       β”œβ”€β”€ bootstrap.sublime-workspace
    β”‚       β”œβ”€β”€ Gruntfile.js
    β”‚       β”œβ”€β”€ package.json
    β”‚       β”œβ”€β”€ README.md
    β”‚       β”œβ”€β”€ server.js
    β”‚       └── www
    β”‚           β”œβ”€β”€ About.html
    β”‚           β”œβ”€β”€ Contact.html
    β”‚           β”œβ”€β”€ css
    β”‚           β”‚   β”œβ”€β”€ amelia-bootstrap.min.css
    β”‚           β”‚   β”œβ”€β”€ bootstrap.css
    β”‚           β”‚   β”œβ”€β”€ bootstrap.css.map
    β”‚           β”‚   β”œβ”€β”€ bootstrap.min.css
    β”‚           β”‚   β”œβ”€β”€ bootstrap.min.css.map
    β”‚           β”‚   β”œβ”€β”€ bootstrap-theme.css
    β”‚           β”‚   β”œβ”€β”€ bootstrap-theme.css.map
    β”‚           β”‚   β”œβ”€β”€ bootstrap-theme.min.css
    β”‚           β”‚   β”œβ”€β”€ bootstrap-theme.min.css.map
    β”‚           β”‚   └── site.css
    β”‚           β”œβ”€β”€ fonts
    β”‚           β”‚   β”œβ”€β”€ glyphicons-halflings-regular.eot
    β”‚           β”‚   β”œβ”€β”€ glyphicons-halflings-regular.svg
    β”‚           β”‚   β”œβ”€β”€ glyphicons-halflings-regular.ttf
    β”‚           β”‚   β”œβ”€β”€ glyphicons-halflings-regular.woff
    β”‚           β”‚   └── glyphicons-halflings-regular.woff2
    β”‚           β”œβ”€β”€ images
    β”‚           β”‚   β”œβ”€β”€ carousel-1.jpg
    β”‚           β”‚   β”œβ”€β”€ carousel-2.jpg
    β”‚           β”‚   β”œβ”€β”€ carousel-3.jpg
    β”‚           β”‚   β”œβ”€β”€ carousel-4.jpg
    β”‚           β”‚   β”œβ”€β”€ lebowski-1.jpg
    β”‚           β”‚   β”œβ”€β”€ lebowski-2.jpg
    β”‚           β”‚   └── lebowski-3.jpg
    β”‚           β”œβ”€β”€ Index.html
    β”‚           β”œβ”€β”€ js
    β”‚           β”‚   β”œβ”€β”€ bootstrap.js
    β”‚           β”‚   β”œβ”€β”€ bootstrap.min.js
    β”‚           β”‚   β”œβ”€β”€ jquery-2.0.3.js
    β”‚           β”‚   β”œβ”€β”€ jquery-2.0.3.min.js
    β”‚           β”‚   β”œβ”€β”€ npm.js
    β”‚           β”‚   └── site.js
    β”‚           └── test.html
    β”œβ”€β”€ JS
    β”‚   └── dnd
    β”‚       β”œβ”€β”€ binDropZone.js
    β”‚       β”œβ”€β”€ binDropZoneWrapper.js
    β”‚       β”œβ”€β”€ dnd.css
    β”‚       β”œβ”€β”€ dragManager.css
    β”‚       β”œβ”€β”€ dragManager.js
    β”‚       β”œβ”€β”€ dragObject.js
    β”‚       β”œβ”€β”€ dragZone.js
    β”‚       β”œβ”€β”€ dragZoneWrapper.js
    β”‚       β”œβ”€β”€ dropZone.js
    β”‚       β”œβ”€β”€ dropZoneWrapper.js
    β”‚       β”œβ”€β”€ helpers.js
    β”‚       β”œβ”€β”€ img
    β”‚       β”‚   β”œβ”€β”€ JS_DnD_Activity_Diagram.png
    β”‚       β”‚   └── JS_DnD_Collaboration_Diagram.png
    β”‚       β”œβ”€β”€ index.html
    β”‚       β”œβ”€β”€ README.md
    β”‚       β”œβ”€β”€ treeDragObject.js
    β”‚       β”œβ”€β”€ treeDragZone.js
    β”‚       └── treeDropZone.js
    β”œβ”€β”€ [Mail.ru] Frontend
    β”‚   β”œβ”€β”€ Gruntfile.js
    β”‚   β”œβ”€β”€ package.json
    β”‚   β”œβ”€β”€ public_html
    β”‚   β”‚   β”œβ”€β”€ css
    β”‚   β”‚   β”‚   β”œβ”€β”€ fonts
    β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ glyphicons-halflings-regular.eot
    β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ glyphicons-halflings-regular.svg
    β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ glyphicons-halflings-regular.ttf
    β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ glyphicons-halflings-regular.woff
    β”‚   β”‚   β”‚   β”‚   └── glyphicons-halflings-regular.woff2
    β”‚   β”‚   β”‚   β”œβ”€β”€ main.css
    β”‚   β”‚   β”‚   └── vendor
    β”‚   β”‚   β”‚       β”œβ”€β”€ bootstrap.min.css
    β”‚   β”‚   β”‚       └── bootstrap-theme.min.css
    β”‚   β”‚   β”œβ”€β”€ index.html
    β”‚   β”‚   β”œβ”€β”€ js
    β”‚   β”‚   β”‚   β”œβ”€β”€ app
    β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ index.js
    β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ sandbox.js
    β”‚   β”‚   β”‚   β”‚   └── sandbox_tmpl
    β”‚   β”‚   β”‚   β”‚       └── player.js
    β”‚   β”‚   β”‚   β”œβ”€β”€ collections
    β”‚   β”‚   β”‚   β”‚   └── users.js
    β”‚   β”‚   β”‚   β”œβ”€β”€ main.js
    β”‚   β”‚   β”‚   β”œβ”€β”€ models
    β”‚   β”‚   β”‚   β”‚   └── user.js
    β”‚   β”‚   β”‚   β”œβ”€β”€ router.js
    β”‚   β”‚   β”‚   β”œβ”€β”€ tmpl
    β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ game.js
    β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ gamelist.js
    β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ login.js
    β”‚   β”‚   β”‚   β”‚   └── main.js
    β”‚   β”‚   β”‚   β”œβ”€β”€ vendor
    β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ backbone.js
    β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ bootstrap.js
    β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ jquery.js
    β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ require.js
    β”‚   β”‚   β”‚   β”‚   └── underscore.js
    β”‚   β”‚   β”‚   └── views
    β”‚   β”‚   β”‚       β”œβ”€β”€ game.js
    β”‚   β”‚   β”‚       β”œβ”€β”€ gamelist.js
    β”‚   β”‚   β”‚       β”œβ”€β”€ login.js
    β”‚   β”‚   β”‚       └── main.js
    β”‚   β”‚   └── tests
    β”‚   β”‚       β”œβ”€β”€ index.html
    β”‚   β”‚       β”œβ”€β”€ qunit.css
    β”‚   β”‚       β”œβ”€β”€ qunit.js
    β”‚   β”‚       └── tests.js
    β”‚   β”œβ”€β”€ README.md
    β”‚   β”œβ”€β”€ _sandbox
    β”‚   β”‚   β”œβ”€β”€ css
    β”‚   β”‚   β”‚   β”œβ”€β”€ main.css
    β”‚   β”‚   β”‚   └── normalize.css
    β”‚   β”‚   β”œβ”€β”€ index.html
    β”‚   β”‚   └── js
    β”‚   β”‚       β”œβ”€β”€ main.js
    β”‚   β”‚       β”œβ”€β”€ plugins.js
    β”‚   β”‚       └── vendor
    β”‚   β”‚           β”œβ”€β”€ jquery-1.12.0.min.js
    β”‚   β”‚           └── modernizr-2.8.3.min.js
    β”‚   β”œβ”€β”€ server.js
    β”‚   β”œβ”€β”€ server_tml
    β”‚   β”‚   └── authresponse.txt
    β”‚   └── templates
    β”‚       β”œβ”€β”€ gamelist.xml
    β”‚       β”œβ”€β”€ game.xml
    β”‚       β”œβ”€β”€ login.xml
    β”‚       β”œβ”€β”€ main.xml
    β”‚       └── sandbox
    β”‚           └── player.xml
    β”œβ”€β”€ [Mail.ru] Web Applications
    β”‚   β”œβ”€β”€ news_portal
    β”‚   β”‚   β”œβ”€β”€ comments
    β”‚   β”‚   β”‚   β”œβ”€β”€ admin.py
    β”‚   β”‚   β”‚   β”œβ”€β”€ apps.py
    β”‚   β”‚   β”‚   β”œβ”€β”€ forms.py
    β”‚   β”‚   β”‚   β”œβ”€β”€ __init__.py
    β”‚   β”‚   β”‚   β”œβ”€β”€ migrations
    β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ 0001_initial.py
    β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ 0002_auto_20160521_1546.py
    β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ 0003_auto_20160528_1639.py
    β”‚   β”‚   β”‚   β”‚   └── __init__.py
    β”‚   β”‚   β”‚   β”œβ”€β”€ models.py
    β”‚   β”‚   β”‚   β”œβ”€β”€ tests.py
    β”‚   β”‚   β”‚   └── views.py
    β”‚   β”‚   β”œβ”€β”€ config-cent.json
    β”‚   β”‚   β”œβ”€β”€ core
    β”‚   β”‚   β”‚   β”œβ”€β”€ admin.py
    β”‚   β”‚   β”‚   β”œβ”€β”€ apps.py
    β”‚   β”‚   β”‚   β”œβ”€β”€ forms.py
    β”‚   β”‚   β”‚   β”œβ”€β”€ __init__.py
    β”‚   β”‚   β”‚   β”œβ”€β”€ migrations
    β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ 0001_initial.py
    β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ 0002_auto_20160528_1639.py
    β”‚   β”‚   β”‚   β”‚   └── __init__.py
    β”‚   β”‚   β”‚   β”œβ”€β”€ models.py
    β”‚   β”‚   β”‚   β”œβ”€β”€ settings.py
    β”‚   β”‚   β”‚   β”œβ”€β”€ static
    β”‚   β”‚   β”‚   β”‚   └── core
    β”‚   β”‚   β”‚   β”‚       β”œβ”€β”€ base.css
    β”‚   β”‚   β”‚   β”‚       β”œβ”€β”€ base.js
    β”‚   β”‚   β”‚   β”‚       β”œβ”€β”€ centrifuge.js
    β”‚   β”‚   β”‚   β”‚       β”œβ”€β”€ chosen.css
    β”‚   β”‚   β”‚   β”‚       β”œβ”€β”€ chosen.jquery.js
    β”‚   β”‚   β”‚   β”‚       β”œβ”€β”€ [email protected]
    β”‚   β”‚   β”‚   β”‚       β”œβ”€β”€ chosen-sprite.png
    β”‚   β”‚   β”‚   β”‚       └── Earth.png
    β”‚   β”‚   β”‚   β”œβ”€β”€ templates
    β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ admin
    β”‚   β”‚   β”‚   β”‚   β”‚   └── base_site.html
    β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ base.html
    β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ login.html
    β”‚   β”‚   β”‚   β”‚   └── signup.html
    β”‚   β”‚   β”‚   β”œβ”€β”€ urls.py
    β”‚   β”‚   β”‚   β”œβ”€β”€ views.py
    β”‚   β”‚   β”‚   └── wsgi.py
    β”‚   β”‚   β”œβ”€β”€ db.sqlite3
    β”‚   β”‚   β”œβ”€β”€ environment.yml
    β”‚   β”‚   β”œβ”€β”€ hello
    β”‚   β”‚   β”‚   β”œβ”€β”€ admin.py
    β”‚   β”‚   β”‚   β”œβ”€β”€ apps.py
    β”‚   β”‚   β”‚   β”œβ”€β”€ __init__.py
    β”‚   β”‚   β”‚   β”œβ”€β”€ migrations
    β”‚   β”‚   β”‚   β”‚   └── __init__.py
    β”‚   β”‚   β”‚   β”œβ”€β”€ models.py
    β”‚   β”‚   β”‚   β”œβ”€β”€ templates
    β”‚   β”‚   β”‚   β”‚   └── hello.html
    β”‚   β”‚   β”‚   β”œβ”€β”€ tests.py
    β”‚   β”‚   β”‚   β”œβ”€β”€ urls.py
    β”‚   β”‚   β”‚   └── views.py
    β”‚   β”‚   β”œβ”€β”€ manage.py
    β”‚   β”‚   β”œβ”€β”€ news
    β”‚   β”‚   β”‚   β”œβ”€β”€ admin.py
    β”‚   β”‚   β”‚   β”œβ”€β”€ apps.py
    β”‚   β”‚   β”‚   β”œβ”€β”€ forms.py
    β”‚   β”‚   β”‚   β”œβ”€β”€ __init__.py
    β”‚   β”‚   β”‚   β”œβ”€β”€ management
    β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ commands
    β”‚   β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ __init__.py
    β”‚   β”‚   β”‚   β”‚   β”‚   └── recount_likes.py
    β”‚   β”‚   β”‚   β”‚   └── __init__.py
    β”‚   β”‚   β”‚   β”œβ”€β”€ migrations
    β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ 0001_initial.py
    β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ 0002_auto_20160515_1856.py
    β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ 0003_article_rating.py
    β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ 0004_articlelike.py
    β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ 0005_auto_20160526_1046.py
    β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ 0006_articlelike_is_liked.py
    β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ 0007_auto_20160527_1004.py
    β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ 0008_article_is_published.py
    β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ 0009_auto_20160528_1124.py
    β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ 0010_auto_20160531_2141.py
    β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ 0011_auto_20160531_2201.py
    β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ 0012_auto_20160531_2202.py
    β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ 0013_auto_20160531_2252.py
    β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ 0014_auto_20160602_0912.py
    β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ 0015_auto_20160602_0940.py
    β”‚   β”‚   β”‚   β”‚   └── __init__.py
    β”‚   β”‚   β”‚   β”œβ”€β”€ models.py
    β”‚   β”‚   β”‚   β”œβ”€β”€ static
    β”‚   β”‚   β”‚   β”‚   └── news
    β”‚   β”‚   β”‚   β”‚       β”œβ”€β”€ comments_subscribe.js
    β”‚   β”‚   β”‚   β”‚       └── like_button.js
    β”‚   β”‚   β”‚   β”œβ”€β”€ templates
    β”‚   β”‚   β”‚   β”‚   └── news
    β”‚   β”‚   β”‚   β”‚       β”œβ”€β”€ article.html
    β”‚   β”‚   β”‚   β”‚       β”œβ”€β”€ create.html
    β”‚   β”‚   β”‚   β”‚       β”œβ”€β”€ like_button.html
    β”‚   β”‚   β”‚   β”‚       └── list.html
    β”‚   β”‚   β”‚   β”œβ”€β”€ tests.py
    β”‚   β”‚   β”‚   β”œβ”€β”€ urls.py
    β”‚   β”‚   β”‚   └── views.py
    β”‚   β”‚   β”œβ”€β”€ README.md
    β”‚   β”‚   β”œβ”€β”€ requirements.txt
    β”‚   β”‚   β”œβ”€β”€ techsupport
    β”‚   β”‚   β”‚   β”œβ”€β”€ admin.py
    β”‚   β”‚   β”‚   β”œβ”€β”€ apps.py
    β”‚   β”‚   β”‚   β”œβ”€β”€ forms.py
    β”‚   β”‚   β”‚   β”œβ”€β”€ __init__.py
    β”‚   β”‚   β”‚   β”œβ”€β”€ migrations
    β”‚   β”‚   β”‚   β”‚   └── __init__.py
    β”‚   β”‚   β”‚   β”œβ”€β”€ models.py
    β”‚   β”‚   β”‚   β”œβ”€β”€ templates
    β”‚   β”‚   β”‚   β”‚   └── techsupport
    β”‚   β”‚   β”‚   β”‚       └── report_error.html
    β”‚   β”‚   β”‚   β”œβ”€β”€ templatetags
    β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ form_extras.py
    β”‚   β”‚   β”‚   β”‚   └── __init__.py
    β”‚   β”‚   β”‚   β”œβ”€β”€ tests.py
    β”‚   β”‚   β”‚   └── views.py
    β”‚   β”‚   └── tictactoe
    β”‚   β”‚       β”œβ”€β”€ admin.py
    β”‚   β”‚       β”œβ”€β”€ apps.py
    β”‚   β”‚       β”œβ”€β”€ forms.py
    β”‚   β”‚       β”œβ”€β”€ __init__.py
    β”‚   β”‚       β”œβ”€β”€ migrations
    β”‚   β”‚       β”‚   β”œβ”€β”€ 0001_initial.py
    β”‚   β”‚       β”‚   β”œβ”€β”€ 0002_auto_20160528_1654.py
    β”‚   β”‚       β”‚   β”œβ”€β”€ 0003_auto_20160528_1739.py
    β”‚   β”‚       β”‚   β”œβ”€β”€ 0004_auto_20160528_1848.py
    β”‚   β”‚       β”‚   β”œβ”€β”€ 0005_auto_20160528_1911.py
    β”‚   β”‚       β”‚   β”œβ”€β”€ 0006_auto_20160531_2141.py
    β”‚   β”‚       β”‚   └── __init__.py
    β”‚   β”‚       β”œβ”€β”€ models.py
    β”‚   β”‚       β”œβ”€β”€ templates
    β”‚   β”‚       β”‚   └── tictactoe
    β”‚   β”‚       β”‚       β”œβ”€β”€ accept_invitation.html
    β”‚   β”‚       β”‚       β”œβ”€β”€ game.html
    β”‚   β”‚       β”‚       β”œβ”€β”€ game_list_snippet.html
    β”‚   β”‚       β”‚       β”œβ”€β”€ invitation.html
    β”‚   β”‚       β”‚       β”œβ”€β”€ list_snippet.html
    β”‚   β”‚       β”‚       └── move.html
    β”‚   β”‚       β”œβ”€β”€ tests.py
    β”‚   β”‚       β”œβ”€β”€ urls.py
    β”‚   β”‚       └── views.py
    β”‚   └── tcp_servers
    β”‚       β”œβ”€β”€ async.py
    β”‚       β”œβ”€β”€ client.py
    β”‚       β”œβ”€β”€ fork.py
    β”‚       β”œβ”€β”€ __init__.py
    β”‚       β”œβ”€β”€ itportal.conf
    β”‚       β”œβ”€β”€ mipt.conf
    β”‚       β”œβ”€β”€ nginx.conf
    β”‚       β”œβ”€β”€ prefork.py
    β”‚       β”œβ”€β”€ simple_http.py
    β”‚       β”œβ”€β”€ simple.py
    β”‚       └── stackoverflow.conf
    β”œβ”€β”€ [MIPT] Machine Learning
    β”‚   β”œβ”€β”€ dataset.tsv
    β”‚   β”œβ”€β”€ example_koi_8.txt
    β”‚   β”œβ”€β”€ example_utf8.txt
    β”‚   β”œβ”€β”€ file_to_write_in.txt
    β”‚   β”œβ”€β”€ nbmerge.py
    β”‚   β”œβ”€β”€ second_file_for_write_in.txt
    β”‚   β”œβ”€β”€ updated_dataset.csv
    β”‚   β”œβ”€β”€ w1.ipynb
    β”‚   β”œβ”€β”€ w2.ipynb
    β”‚   β”œβ”€β”€ w3_matrix_operations.ipynb
    β”‚   β”œβ”€β”€ w3_vector_operations.ipynb
    β”‚   └── w4_stochastic_variables.ipynb
    β”œβ”€β”€ NodeJS
    β”‚   β”œβ”€β”€ ajax-test
    β”‚   β”‚   β”œβ”€β”€ fileUploader.js
    β”‚   β”‚   β”œβ”€β”€ index.html
    β”‚   β”‚   β”œβ”€β”€ load-digits.js
    β”‚   β”‚   β”œβ”€β”€ load-phones.js
    β”‚   β”‚   β”œβ”€β”€ package.json
    β”‚   β”‚   β”œβ”€β”€ phones.json
    β”‚   β”‚   β”œβ”€β”€ README.md
    β”‚   β”‚   β”œβ”€β”€ server.js
    β”‚   β”‚   β”œβ”€β”€ uploader.js
    β”‚   β”‚   └── upload.js
    β”‚   β”œβ”€β”€ express-test
    β”‚   β”‚   β”œβ”€β”€ app.js
    β”‚   β”‚   β”œβ”€β”€ config
    β”‚   β”‚   β”‚   β”œβ”€β”€ config.json
    β”‚   β”‚   β”‚   └── index.js
    β”‚   β”‚   β”œβ”€β”€ db.js
    β”‚   β”‚   β”œβ”€β”€ errors
    β”‚   β”‚   β”‚   └── index.js
    β”‚   β”‚   β”œβ”€β”€ libs
    β”‚   β”‚   β”‚   β”œβ”€β”€ index.js
    β”‚   β”‚   β”‚   β”œβ”€β”€ logger.js
    β”‚   β”‚   β”‚   β”œβ”€β”€ mongoose.js
    β”‚   β”‚   β”‚   └── sessionStore.js
    β”‚   β”‚   β”œβ”€β”€ middleware
    β”‚   β”‚   β”‚   β”œβ”€β”€ checkAuth.js
    β”‚   β”‚   β”‚   β”œβ”€β”€ hitCounter.js
    β”‚   β”‚   β”‚   β”œβ”€β”€ loadUser.js
    β”‚   β”‚   β”‚   └── sendHttpError.js
    β”‚   β”‚   β”œβ”€β”€ models
    β”‚   β”‚   β”‚   └── user.js
    β”‚   β”‚   β”œβ”€β”€ package.json
    β”‚   β”‚   β”œβ”€β”€ README.md
    β”‚   β”‚   β”œβ”€β”€ routes
    β”‚   β”‚   β”‚   β”œβ”€β”€ chat.js
    β”‚   β”‚   β”‚   β”œβ”€β”€ index.js
    β”‚   β”‚   β”‚   β”œβ”€β”€ login.js
    β”‚   β”‚   β”‚   β”œβ”€β”€ logout.js
    β”‚   β”‚   β”‚   └── users.js
    β”‚   β”‚   β”œβ”€β”€ scripts
    β”‚   β”‚   β”‚   β”œβ”€β”€ async.js
    β”‚   β”‚   β”‚   β”œβ”€β”€ mongodb-test.js
    β”‚   β”‚   β”‚   β”œβ”€β”€ mongoose-test.js
    β”‚   β”‚   β”‚   └── mongoose-test-v2.js
    β”‚   β”‚   β”œβ”€β”€ socket
    β”‚   β”‚   β”‚   └── index.js
    β”‚   β”‚   β”œβ”€β”€ views
    β”‚   β”‚   β”‚   β”œβ”€β”€ chat.ejs
    β”‚   β”‚   β”‚   β”œβ”€β”€ error.ejs
    β”‚   β”‚   β”‚   β”œβ”€β”€ index.ejs
    β”‚   β”‚   β”‚   β”œβ”€β”€ layout
    β”‚   β”‚   β”‚   β”‚   └── page.ejs
    β”‚   β”‚   β”‚   β”œβ”€β”€ login.ejs
    β”‚   β”‚   β”‚   └── partials
    β”‚   β”‚   β”‚       └── nav.ejs
    β”‚   β”‚   └── webapp
    β”‚   β”‚       β”œβ”€β”€ css
    β”‚   β”‚       β”‚   └── app.css
    β”‚   β”‚       └── js
    β”‚   β”‚           β”œβ”€β”€ chat.js
    β”‚   β”‚           β”œβ”€β”€ login.js
    β”‚   β”‚           └── logout.js
    β”‚   β”œβ”€β”€ longpoll-test
    β”‚   β”‚   β”œβ”€β”€ browser.js
    β”‚   β”‚   β”œβ”€β”€ index.html
    β”‚   β”‚   └── server.js
    β”‚   β”œβ”€β”€ mocha-test
    β”‚   β”‚   β”œβ”€β”€ browser
    β”‚   β”‚   β”‚   β”œβ”€β”€ index.html
    β”‚   β”‚   β”‚   β”œβ”€β”€ mocha.css
    β”‚   β”‚   β”‚   β”œβ”€β”€ mocha.js
    β”‚   β”‚   β”‚   └── tests.js
    β”‚   β”‚   β”œβ”€β”€ config.json
    β”‚   β”‚   β”œβ”€β”€ gulpfile.js
    β”‚   β”‚   β”œβ”€β”€ lib
    β”‚   β”‚   β”‚   β”œβ”€β”€ getPalette.js
    β”‚   β”‚   β”‚   └── hex2rgb.js
    β”‚   β”‚   β”œβ”€β”€ package.json
    β”‚   β”‚   β”œβ”€β”€ README.md
    β”‚   β”‚   β”œβ”€β”€ server.js
    β”‚   β”‚   β”œβ”€β”€ test
    β”‚   β”‚   β”‚   β”œβ”€β”€ fixtures
    β”‚   β”‚   β”‚   β”‚   └── config-not-array.json
    β”‚   β”‚   β”‚   β”œβ”€β”€ getPalette.test.bad.js
    β”‚   β”‚   β”‚   β”œβ”€β”€ getPalette.test.expect.js
    β”‚   β”‚   β”‚   β”œβ”€β”€ getPalette.test.js
    β”‚   β”‚   β”‚   β”œβ”€β”€ getPalette.test.should.js
    β”‚   β”‚   β”‚   β”œβ”€β”€ hex2rgb.test.expect.js
    β”‚   β”‚   β”‚   β”œβ”€β”€ hex2rgb.test.js
    β”‚   β”‚   β”‚   β”œβ”€β”€ hex2rgb.test.should.js
    β”‚   β”‚   β”‚   └── mocha.opts
    β”‚   β”‚   └── views
    β”‚   β”‚       └── index.jade
    β”‚   └── nodejs-test
    β”‚       β”œβ”€β”€ chat-server.js
    β”‚       β”œβ”€β”€ demo
    β”‚       β”‚   β”œβ”€β”€ async.js
    β”‚       β”‚   β”œβ”€β”€ console.js
    β”‚       β”‚   β”œβ”€β”€ ee.js
    β”‚       β”‚   β”œβ”€β”€ error.js
    β”‚       β”‚   β”œβ”€β”€ leak.js
    β”‚       β”‚   β”œβ”€β”€ modules
    β”‚       β”‚   β”‚   β”œβ”€β”€ app.js
    β”‚       β”‚   β”‚   β”œβ”€β”€ db
    β”‚       β”‚   β”‚   β”‚   β”œβ”€β”€ index.js
    β”‚       β”‚   β”‚   β”‚   └── ru.json
    β”‚       β”‚   β”‚   β”œβ”€β”€ logger.js
    β”‚       β”‚   β”‚   β”œβ”€β”€ server.js
    β”‚       β”‚   β”‚   └── user
    β”‚       β”‚   β”‚       └── index.js
    β”‚       β”‚   β”œβ”€β”€ pow.js
    β”‚       β”‚   β”œβ”€β”€ streams.js
    β”‚       β”‚   └── util.js
    β”‚       β”œβ”€β”€ echo.js
    β”‚       β”œβ”€β”€ package.json
    β”‚       β”œβ”€β”€ public
    β”‚       β”‚   β”œβ”€β”€ index.html
    β”‚       β”‚   └── long-poll.js
    β”‚       β”œβ”€β”€ README.md
    β”‚       β”œβ”€β”€ server.js
    β”‚       └── serve-static.js
    β”œβ”€β”€ [pluralsight] Python Beyond the Basics
    β”‚   β”œβ”€β”€ 01 - Packaging.ipynb
    β”‚   β”œβ”€β”€ reader
    β”‚   β”‚   β”œβ”€β”€ compressed
    β”‚   β”‚   β”‚   β”œβ”€β”€ bzipped.py
    β”‚   β”‚   β”‚   β”œβ”€β”€ gzipped.py
    β”‚   β”‚   β”‚   β”œβ”€β”€ __init__.py
    β”‚   β”‚   β”‚   └── __main__.py
    β”‚   β”‚   β”œβ”€β”€ __init__.py
    β”‚   β”‚   └── reader.py
    β”‚   β”œβ”€β”€ reader_exec
    β”‚   β”‚   β”œβ”€β”€ __main__.py
    β”‚   β”‚   └── reader
    β”‚   β”‚       β”œβ”€β”€ compressed
    β”‚   β”‚       β”‚   β”œβ”€β”€ bzipped.py
    β”‚   β”‚       β”‚   β”œβ”€β”€ gzipped.py
    β”‚   β”‚       β”‚   β”œβ”€β”€ __init__.py
    β”‚   β”‚       β”‚   └── __main__.py
    β”‚   β”‚       β”œβ”€β”€ __init__.py
    β”‚   β”‚       └── reader.py
    β”‚   └── test.bz2
    β”œβ”€β”€ [pluralsight] Python fundamentals
    β”‚   β”œβ”€β”€ 01-05 - Basics.ipynb
    β”‚   β”œβ”€β”€ 03,06 - strings and collections.ipynb
    β”‚   β”œβ”€β”€ 07 - exception handling.ipynb
    β”‚   β”œβ”€β”€ 08 - Iterables.ipynb
    β”‚   β”œβ”€β”€ 09 - classes
    β”‚   β”‚   └── airtravel.py
    β”‚   β”œβ”€β”€ 10 - File and resource management
    β”‚   β”‚   β”œβ”€β”€ 10 - File and Resource management.ipynb
    β”‚   β”‚   β”œβ”€β”€ bmp.py
    β”‚   β”‚   β”œβ”€β”€ files.py
    β”‚   β”‚   β”œβ”€β”€ fractal.py
    β”‚   β”‚   β”œβ”€β”€ fridge.py
    β”‚   β”‚   β”œβ”€β”€ recaman.dat
    β”‚   β”‚   β”œβ”€β”€ recaman.py
    β”‚   β”‚   β”œβ”€β”€ series.py
    β”‚   β”‚   └── wasteland.txt
    β”‚   β”œβ”€β”€ 11 - Testing and Shipping
    β”‚   β”‚   β”œβ”€β”€ palindrome
    β”‚   β”‚   β”‚   β”œβ”€β”€ palindrome.py
    β”‚   β”‚   β”‚   └── setup.py
    β”‚   β”‚   └── text_analyzer.py
    β”‚   β”œβ”€β”€ exceptional.py
    β”‚   β”œβ”€β”€ generator.py
    β”‚   β”œβ”€β”€ roots.py
    β”‚   └── words.py
    └── [Stanford] Compilers
        β”œβ”€β”€ assignments
        β”‚   β”œβ”€β”€ PA2
        β”‚   β”‚   β”œβ”€β”€ cool.flex
        β”‚   β”‚   β”œβ”€β”€ cool-lex.cc
        β”‚   β”‚   β”œβ”€β”€ cool-lex.d
        β”‚   β”‚   β”œβ”€β”€ escapednull.cool
        β”‚   β”‚   β”œβ”€β”€ escapednull.out
        β”‚   β”‚   β”œβ”€β”€ handle_flags.cc
        β”‚   β”‚   β”œβ”€β”€ handle_flags.d
        β”‚   β”‚   β”œβ”€β”€ hello_world.cl
        β”‚   β”‚   β”œβ”€β”€ lextest.cc
        β”‚   β”‚   β”œβ”€β”€ lextest.d
        β”‚   β”‚   β”œβ”€β”€ mycoolc
        β”‚   β”‚   β”œβ”€β”€ octal.cool
        β”‚   β”‚   β”œβ”€β”€ octal.out
        β”‚   β”‚   β”œβ”€β”€ pa1-grading.pl
        β”‚   β”‚   β”œβ”€β”€ README
        β”‚   β”‚   β”œβ”€β”€ stringtab.cc
        β”‚   β”‚   β”œβ”€β”€ stringtab.d
        β”‚   β”‚   β”œβ”€β”€ test.cl
        β”‚   β”‚   β”œβ”€β”€ test.err
        β”‚   β”‚   β”œβ”€β”€ test.out
        β”‚   β”‚   β”œβ”€β”€ utilities.cc
        β”‚   β”‚   └── utilities.d
        β”‚   β”œβ”€β”€ PA3
        β”‚   β”‚   β”œβ”€β”€ bad.cl
        β”‚   β”‚   β”œβ”€β”€ bison-test
        β”‚   β”‚   β”‚   β”œβ”€β”€ calc
        β”‚   β”‚   β”‚   β”œβ”€β”€ calc_simple.tab.c
        β”‚   β”‚   β”‚   β”œβ”€β”€ calc_simple.y
        β”‚   β”‚   β”‚   β”œβ”€β”€ calc.tab.c
        β”‚   β”‚   β”‚   β”œβ”€β”€ calc.y
        β”‚   β”‚   β”‚   β”œβ”€β”€ mfcalc
        β”‚   β”‚   β”‚   β”œβ”€β”€ mfcalc.c
        β”‚   β”‚   β”‚   β”œβ”€β”€ mfcalc.h
        β”‚   β”‚   β”‚   β”œβ”€β”€ mfcalc.tab.c
        β”‚   β”‚   β”‚   β”œβ”€β”€ mfcalc.tab.h
        β”‚   β”‚   β”‚   β”œβ”€β”€ mfcalc.y
        β”‚   β”‚   β”‚   β”œβ”€β”€ non-slr.tab.c
        β”‚   β”‚   β”‚   β”œβ”€β”€ non-slr.y
        β”‚   β”‚   β”‚   β”œβ”€β”€ rpcalc
        β”‚   β”‚   β”‚   β”œβ”€β”€ rpcalc.tab.c
        β”‚   β”‚   β”‚   β”œβ”€β”€ rpcalc.y
        β”‚   β”‚   β”‚   β”œβ”€β”€ simple.tab.c
        β”‚   β”‚   β”‚   └── simple.y
        β”‚   β”‚   β”œβ”€β”€ cool-parse.cc
        β”‚   β”‚   β”œβ”€β”€ cool-parse.d
        β”‚   β”‚   β”œβ”€β”€ cool.tab.h
        β”‚   β”‚   β”œβ”€β”€ cool-tree.cc
        β”‚   β”‚   β”œβ”€β”€ cool-tree.d
        β”‚   β”‚   β”œβ”€β”€ cool-tree.handcode.h
        β”‚   β”‚   β”œβ”€β”€ cool.y
        β”‚   β”‚   β”œβ”€β”€ dumptype.cc
        β”‚   β”‚   β”œβ”€β”€ dumptype.d
        β”‚   β”‚   β”œβ”€β”€ good.cl
        β”‚   β”‚   β”œβ”€β”€ handle_flags.cc
        β”‚   β”‚   β”œβ”€β”€ handle_flags.d
        β”‚   β”‚   β”œβ”€β”€ mycoolc
        β”‚   β”‚   β”œβ”€β”€ myparser
        β”‚   β”‚   β”œβ”€β”€ pa2-grading.pl
        β”‚   β”‚   β”œβ”€β”€ parser-phase.cc
        β”‚   β”‚   β”œβ”€β”€ parser-phase.d
        β”‚   β”‚   β”œβ”€β”€ README
        β”‚   β”‚   β”œβ”€β”€ stringtab.cc
        β”‚   β”‚   β”œβ”€β”€ stringtab.d
        β”‚   β”‚   β”œβ”€β”€ tokens-lex.cc
        β”‚   β”‚   β”œβ”€β”€ tokens-lex.d
        β”‚   β”‚   β”œβ”€β”€ tree.cc
        β”‚   β”‚   β”œβ”€β”€ tree.d
        β”‚   β”‚   β”œβ”€β”€ utilities.cc
        β”‚   β”‚   └── utilities.d
        β”‚   └── PA4
        β”‚       β”œβ”€β”€ q2.coolc
        β”‚       β”œβ”€β”€ q2.s
        β”‚       β”œβ”€β”€ q4.coolc
        β”‚       β”œβ”€β”€ q4.s
        β”‚       β”œβ”€β”€ q6.coolc
        β”‚       └── q6.s
        β”œβ”€β”€ cool-jison
        β”‚   β”œβ”€β”€ arith.cl
        β”‚   β”œβ”€β”€ bad2.cl
        β”‚   β”œβ”€β”€ badblock.cl
        β”‚   β”œβ”€β”€ bad.cl
        β”‚   β”œβ”€β”€ cool.flex
        β”‚   β”œβ”€β”€ cool-gho.js
        β”‚   β”œβ”€β”€ cool-gho-old.js
        β”‚   β”œβ”€β”€ cool.js
        β”‚   β”œβ”€β”€ cool_.js
        β”‚   β”œβ”€β”€ cool-lex.js
        β”‚   β”œβ”€β”€ cool-new.y
        β”‚   β”œβ”€β”€ cool.output2
        β”‚   β”œβ”€β”€ cool.y
        β”‚   β”œβ”€β”€ escapednull.cool
        β”‚   β”œβ”€β”€ escapedquote.cool
        β”‚   β”œβ”€β”€ firstclasserrored.cl
        β”‚   β”œβ”€β”€ hello_world.cl
        β”‚   β”œβ”€β”€ index.js
        β”‚   β”œβ”€β”€ invalidcharacters.cool
        β”‚   β”œβ”€β”€ life.cl
        β”‚   β”œβ”€β”€ null_in_code.cl.cool
        β”‚   β”œβ”€β”€ null_in_string.cl.cool
        β”‚   β”œβ”€β”€ null_in_string_followed_by_tokens.cl.cool
        β”‚   β”œβ”€β”€ package.json
        β”‚   └── README.md
        └── README.md

135 directories, 608 files

I think none of these belong in the distributed package and they should be removed.

FWIW, the sha256 of the package I got is:

5ead101834c35cc1dc89b3e545746390e828bdbde15506d1c92aa9c8101ec65e  jison-gho-0.6.1-215.tgz

Rename project, new identity.

Thanks for the great work.
Now that this fork is far ahead of the original project, it may be a good idea to give it its own identity. I could easily have missed this fork, and I assume that happens to a lot of people.

Result location @$ is undefined in v0.6.0-188

I’m not sure if I’m doing something wrong, but result locations @$ always seem to be undefined in v0.6.0-188. If I run

npm install https://github.com/GerHobbelt/jison/archive/0.6.0-188.tar.gz
cat <<'EOF' > test.jison
%lex
%%
. return 'CHARACTER';
/lex

%start characters

%%

characters
  : character
  | characters character
  ;

character: 'CHARACTER' { console.log(@$); };
EOF
node_modules/jison-gho/lib/cli.js test.jison
node --eval "require('$(pwd)/test.js').parse('abc')"

on macOS 10.12.6, the output is

undefined
undefined
undefined
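For reference, when location tracking is working, @$ is expected to be a location object rather than undefined (field names per classic jison; the range member is my assumption and only appears when ranges are enabled), so an action along these lines should print something useful:

character: 'CHARACTER' { console.log(@$.first_line + ':' + @$.first_column + '-' + @$.last_column); };
// expected shape of @$ (roughly): { first_line: 1, last_line: 1, first_column: 0, last_column: 1, range: [0, 1] }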

part of #21: no concise error messages

Extracting another part from #21:

No concise error messages?

The -t flag generates too much extra output, while classic jison reported nice, concise error messages.
classic jison:

$ node cool.js bad.cl
Parse error on line 12:
... type identifier *)Class b inherits A {
----------------------^
Expecting 'CLASS', got 'TYPEID'
C:\code\sandbox\[Stanford] Compilers\cool-jison\cool.js:394
                    throw new Error(errStr || 'Parsing halted while starting to recover from another error.');

jison-gho: output is too long (impractical for anything other than deep debugging)

I could not find a similar mechanism for error messaging in your fork (or it does not work on my grammar for some reason). I'd like to have the kind of concise error messages that classic jison used to produce in my final compiler.

Supplementary plane error due to a character I'm not actually using explicitly in my regex.

I have the following regex in my grammar file:

[\t\n\r\u0020-\uD7FF\uE000\uFFFD\u10000-\u10FFFF]

And I get the following error due to the regex above:

Error: You have Unicode Supplementary Plane content in a regex set: JavaScript has severe problems with Supplementary Plane content, particularly in regexes, so you are kindly required to get rid of this stuff. Sorry! (Offending UCS-2 code which triggered this: 0xd800)

I know the regex is the source of the error because if I remove it, then everything is fine. Specifically, the problem is \u0020-\uD7FF.

Looking at the code in regexp-lexer.js, I've deduced that the problem occurs when jison-gho computes an inverted character set while trying to optimize the regular expression. When it computes the inverted set, one of the range boundaries is 0xD7FF + 1, and the error is triggered.

I can understand complaining about a user-written regular expression that goes into the supplementary plane, but here we're talking about a regular expression that is computed behind the scenes. Should there even be an error raised on the inverted set which is computed internally?
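A possible workaround on the grammar side until this is addressed (my own sketch, assuming the lexer accepts the alternation form and that matching astral characters as explicit surrogate pairs is acceptable; the CHAR token name is just a placeholder): keep the BMP part of the set and express the supplementary-plane range as a surrogate-pair alternative, so no range boundary ever crosses into 0xD800-0xDFFF:

[\t\n\r\u0020-\uD7FF\uE000\uFFFD]|[\uD800-\uDBFF][\uDC00-\uDFFF]     return 'CHAR';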

arrow notation barfs on comments

zaach/jison parses

expr: 'foo' -> 1 // comment
;

jison-gho 0.6.1-216 and master fail if there's a comment following the semantic action.

(zaach/jison also parses extra ';'s

expr: 'foo' -> 1; // comment
;

but I think that's a bug.)
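Until this is fixed, a possible workaround (an untested assumption on my part) is to keep the trailing comment away from the arrow expression, e.g. by switching to a full action block:

expr
    : 'foo' { $$ = 1; }   // comment
    ;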

add CLI + API option to enable/disable/pick-a-mode for (action) code snippets' validation

I.e. add the ability to manipulate the jison-helpers-lib::checkActionBlock() behaviour.

Currently all code blocks MUST pass the JavaScript validation performed by that API via esprima/recast, but several examples exist which showcase grammars with admittedly non-working, non-compiling code, and we don't want to be bothered by that.

Food For Thought

The other, perhaps more sensible, use case is when you create parsers/lexers and write your action blocks and/or other code sections in another language that compiles to JavaScript, say TypeScript. Currently jison does NOT support that mode of usage; it MAY be useful to provide a pre-compile/pre-process plugin interface for this API so that a user-determined, user-specified toolchain CAN be applied to every chunk of code and thus help produce a validated parser/lexer instance or source code. πŸŽ‰
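Purely as an illustration of the request, a hypothetical shape for such an option (none of these flags or option names exist today):

jison grammar.jison --validate-actions=off       # skip checkActionBlock() entirely
jison grammar.jison --validate-actions=warn      # report problems, but don't abort
jison grammar.jison --validate-actions=strict    # current behaviour

// or via the API (hypothetical knob mapping onto checkActionBlock()):
var gen = new jison.Generator(grammar, {
    validateActionCode: 'warn'
});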

`lexer.conditions[K].rules` grows with each call to parse

The lexer code in setInput switches the .rules to 1-indexed:

for (var k in conditions) {
  var spec = conditions[k];
  var rule_ids = spec.rules;
  var len = rule_ids.length;
  var rule_regexes = new Array(len + 1); // slot 0 is unused; we use a 1-based index approach here to keep the hottest code in `lexer_next()` fast and simple! 
  var rule_new_ids = new Array(len + 1);

  for (var i = 0; i < len; i++) {
    var idx = rule_ids[i];
    var rule_re = rules[idx];
    rule_regexes[i + 1] = rule_re;
    rule_new_ids[i + 1] = idx;
  }

  spec.rules = rule_new_ids;
  spec.__rule_regexes = rule_regexes;
  spec.__rule_count = len;
}

If you generate a parser and use it a second time, .rules ends up with an undefined at index [1].
For some reason "foo".match(undefined) matches, so you end up in an endless loop where undefined matches but doesn't advance the cursor.

I found a terrible and a maybe-not-so-bad work-around.

terrible - ignore leading undefines

      for (var i = 1; i <= len; i++) {
        tempMatch = regexes[i] ? this._input.match(regexes[i]) : null;

pros: one line to change
cons: it grows by one undefined on each iteration

maybe-not-so-bad - s/len \+ 1/len/ and unshift into the rules

        for (var k in conditions) {
          var spec = conditions[k];
          var rule_ids = spec.rules;
          var len = rule_ids.length;
          var rule_regexes = new Array(len);             // 0-based now; the constructor unshifts a leading placeholder into .rules instead (see below)
          var rule_new_ids = new Array(len);

          for (var i = 0; i < len; i++) {
            var idx = rule_ids[i];
            var rule_re = rules[idx];
            rule_regexes[i] = rule_re;
            rule_new_ids[i] = idx;
          }

          spec.rules = rule_new_ids;
          spec.__rule_regexes = rule_regexes;
          spec.__rule_count = len;
        }

and in the constructor, prefix each rules block with a placeholder:

Object.keys(lexer.conditions).forEach(c => lexer.conditions[c].rules.unshift(null))

I did this just after

    conditions: {
      'INITIAL': {
        rules: [...]

and was able to use the parser a zillion times (well, north of 2000, at least).
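A minimal reproduction sketch of the underlying problem (file names and inputs are placeholders):

// generate the parser first, e.g.:  jison my-grammar.jison -o parser.js
var parser = require('./parser').parser;

parser.parse('first input');    // OK: setInput() rewrites .rules into the 1-based form
parser.parse('second input');   // hangs: setInput() re-processes the already-rewritten .rules,
                                // so slot [1] is now undefined and matches forever without advancing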

lexer rules for jison (ebnf-parser and lex-parser modules) b0rk very late on unterminated string

Current lexer rules don't identify an unterminated string in action code when it crosses a newline.

Example typo: a string is started as an ES6 template literal but never terminated with a backquote, keeping the old double quote at the end. Note the code chunk at the first TODO:

option
    : NAME[option]
        { $$ = [$option, true]; }
    | NAME[option] '=' OPTION_STRING_VALUE[value]
        { $$ = [$option, $value]; }
    | NAME[option] '=' OPTION_VALUE[value]
        { $$ = [$option, parseValue($value)]; }
    | NAME[option] '=' NAME[value]
        { $$ = [$option, parseValue($value)]; }
    | NAME[option] '=' error
        {
            // TODO ...
            yyerror(`named %option value error for ${$option}?\n\n  Erroneous area:\n" + prettyPrintRange(yylexer, @error, @option));
        }
    | NAME[option] error
        {
            // TODO ...
            yyerror("named %option value assignment error?\n\n  Erroneous area:\n" + prettyPrintRange(yylexer, @error, @option));
        }
    ;
...
%%
...
// 500 lines down, right smack in the middle of the trailing code chunk,
// jison barfs a hairball about a "possibly missing semicolon?"
//     |:-(

results in an error report about 500 (!) lines down.
Error diagnosis was easy, but required backpedaling through git log (diff inspection via TortoiseGit and Beyond Compare): jison should ideally be able to report something sensible near the error origin and not require the user to pull out the rest of their dev toolchest to dig out the mistake.

`react-scripts build` fails to compile generated parser

I have the following errors:
Line 1008: 'assert' is not defined no-undef
Line 1506: 'yyrulelen' is not defined no-undef
Line 1525: 'yyrulelen' is not defined no-undef

Can you publish a fixed version, please?
(Thank you very much, as I really need a fix for this!)
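Until a fixed version is published, one stop-gap (my own workaround sketch, not an official fix) is to prepend an eslint-disable comment to the generated file in a small post-processing step, so react-scripts' no-undef check skips it:

// prepend-eslint-disable.js -- run right after generating the parser
const fs = require('fs');
const file = 'src/parser.js';                    // hypothetical location of the generated parser
const src = fs.readFileSync(file, 'utf8');
if (!src.startsWith('/* eslint-disable */')) {
    fs.writeFileSync(file, '/* eslint-disable */\n' + src);
}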

Use async/await in grammar executed code

I am writing a grammar to interact with an asynchronous service, so I would like to use await in the executed code.

...
| TYPE SELECTOR expr
    {
        await yy.type($2, $3);
    }
| ...

As the code block is not an async function, await can't be used there.

Is there any way to call async functions and wait for their completion before continuing to the next grammar element?
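One possible workaround while actions remain synchronous (my own sketch; it only helps when later rules don't need the resolved value): keep the action synchronous, collect the promises on yy, and await them after parse() returns:

// grammar action stays synchronous:
//   | TYPE SELECTOR expr
//       { yy.pending.push(yy.type($2, $3)); }

const parser = require('./parser').parser;       // hypothetical generated parser

async function run(input, typeService) {
    parser.yy.pending = [];
    parser.yy.type = (selector, expr) => typeService(selector, expr);   // returns a Promise
    const result = parser.parse(input);
    await Promise.all(parser.yy.pending);         // wait for the async side effects to settle
    return result;
}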

action code sometimes doesn't get expanded properly when using ES6/2017 string templates

`this is an ES6 string which has double-quoted jison var ref like this: "${@var}"`

In the above situation, with the quotes inside the template string around the jison variable reference, the variable doesn't get expanded.

The problem existed in the jison grammar spec itself, but a quick hack at least has that one circumvented for the time being:

var js_var = @var;  // this DOES get expanded!
...`this is an ES6 string which has double-quoted jison var ref like this: "${js_var}"`
// ^^^ the above string template delivers as expected and doesn't barf a hairball when executed.

(See bnf.y for an example of this)

This needs to be fixed; we'll need the new tweaked recast library for real for this, as we've just hit the limit of regex replace power. ;-)

Error reported for negative indexed location tracking references, e.g. `@-1`

bison+jison-gho support $n for n <= 0:

Quoting the bison manual at https://www.gnu.org/software/bison/manual/html_node/Actions.html (bold emphasis mine):

$n with n zero or negative is allowed for reference to tokens and groupings on the stack before those that match the current rule.
This is a very risky practice, and to use it reliably you must be certain of the context in which the rule is applied. Here is a case in which you can use this reliably:

foo:
  expr bar '+' expr  { … }
| expr bar '-' expr  { … }
;
bar:
  %empty    { previous_expr = $0; }
;

As long as bar is used only in the fashion shown here, $0 always refers to the expr which precedes bar in the definition of foo.

As bison/jison/jison-gho also support location references, in the format @n or @label, the above implies @-1 is a legal location ref.

That FAILS since 0.6.0-??? when we switched to parsing / validating action code chunks via a patched recast library: https://github.com/GerHobbelt/recast :: https://www.npmjs.com/package/@gerhobbelt/recast which recognizes all jison-gho supported reference types: $n, @n, #n, ##n, #label#... but still barfs on @-1.

Note

Given this, I expect my patched recast/esprima also barfs on #-1 and ##-1...
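For completeness, a minimal jison-style analogue of the bison example above -- this is the kind of rule the recast/esprima check currently rejects, even though @-1 should be as legal as $0:

foo
    : expr bar '+' expr   { /* ... */ }
    | expr bar '-' expr   { /* ... */ }
    ;

bar
    : /* empty */         { yy.prevLoc = @-1; }   // negative location ref into the stack below bar's position
    ;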

Do parsers share a global state?

I'm using a new instance of Parser; the first time I execute it, it works OK, but the second time it hangs. I assumed using a new instance would create a clean state, but it looks like I need to reset some global state?

This is what I'm currently using:

const Parser = require('myparser').Parser
....
//This will run multiple times
const p = new Parser();
p.parse(...)

Longest match isn't respected with literal strings (differs from zaach/jison)

I think I'm noticing jison being too greedy with lex grammars that contain string literals.

Here's a tiny grammar that allows you to and/or strings together, returning a binary tree.

/* lexical grammar */
%lex
%%

<<EOF>>          return 'EOF'
\s                       /* skip whitespace */
"and"                return 'AND'
"or"                   return 'OR'
[a-z]*                return 'WORD'


/lex

/* operator associations and precedence */

%left 'AND'
%left 'OR'

%start expressions

%% /* language grammar */

expressions
    : e EOF
        {return $1;}
    ;

e
    : e AND e
        {$$ = {type:"and", lft: $1, rgt:$3};}
    | e OR e
        {$$ = {type:"or", lft: $1, rgt:$3};}
    | WORD
        {$$ = $1}
    ;

When I try a query like paul and andre, I get a parser error (it tries to read the and of andre as an AND token, rather than as part of a WORD token).

However, when I use the same grammar in the original jison, it parses as expected.
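A possible lexer-side workaround until this difference is resolved (my own sketch, untested against this build): anchor the keyword literals on a word boundary and require at least one letter for WORD, so the and in andre can no longer be claimed by the AND rule:

%lex
%%

\s+                   /* skip whitespace */
"and"\b               return 'AND'
"or"\b                return 'OR'
[a-z]+                return 'WORD'
<<EOF>>               return 'EOF'

/lex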

Strange behavior on parse/lex errors

I am new to jison (my first day of using it) so excuse my probable ignorance in advance. There are some issues after switching to your fork from classical jison which I'd like to report here and get some input.

Issues

1) No concise error messages?

The -t flag generates too much extra output, while classic jison reported nice, concise error messages.
classic jison:

$ node cool.js bad.cl
Parse error on line 12:
... type identifier *)Class b inherits A {
----------------------^
Expecting 'CLASS', got 'TYPEID'
C:\code\sandbox\[Stanford] Compilers\cool-jison\cool.js:394
                    throw new Error(errStr || 'Parsing halted while starting to recover from another error.');

jison-gho: output is too long (impractical for anything other than deep debugging)

I could not find a similar mechanism for error messaging in your fork (or it does not work on my grammar for some reason). I'd like to have the kind of concise error messages that classic jison used to produce in my final compiler.

2) Parser hanging on some bad files

For some reason, the parser generated from your fork hangs on some bad inputs, while classic jison produced a parser that failed gracefully with an error message.

classic jison:

$ node cool.js null_in_code.cl.cool
Parse error on line 1:
...haracter in code *)null character is he
----------------------^
Expecting 'CLASS', got 'OBJECTID'
Lexer error at line 1:
...character is here => <-)
-----------------------^
 Skipping token:
C:\code\sandbox\[Stanford] Compilers\cool-jison\cool.js:394
                    throw new Error(errStr || 'Parsing halted while starting to recover from another error.');

jison-gho: hangs
I can reproduce this on files null_in_code.cl.cool and bad.cl, for example (see below).

3) Error recovery

I cannot get error recovery to work for my grammar. I ported it from flex/bison, and the error recovery mechanism worked there. I recognize that not all flex/bison features are (or will be) supported, but maybe I am doing something intrinsically wrong.

In other words, I get a single error and the parser exits.
Expected: the parser skips to the next error by reducing the error non-terminal.
Example from my grammar:

class_list
: class	';'		/* single class */
  { $$ = ["CLASS_LIST", {}, $1]; }
| class_list class ';'	/* several classes */
  { $$ = prependChild($1, $2);  }
| error ';'    /* error recovery: skip to next class */
;

Notice I am using the same grammar in both classic jison and your fork.
So there are some backward compatibility differences.

Steps to reproduce / grammar

You can get grammar and test files from here:
https://github.com/roman-spiridonov/sandbox/tree/master/%5BStanford%5D%20Compilers/cool-jison

npm install, then npm run jison or npm run jison-gho to build the corresponding parser cool.js.
Then execute commands from issues above.

use @roman-spiridonov's coolc grammar and test files for regression testing

There are a few hairy features (and bugs) that are observable directly or indirectly when running the examples in @roman-spiridonov's coolc project: a series of errors in a 'coolc' source file being fed to a 'coolc' parser which doesn't handle all of them in an intuitive error-recovery way (these errors are close together, and coding error recovery in a grammar is no sinecure either...)

Plus at least one example in there exhibits a bug which is now fixed (see also #21), but it took at least three attempts before my brain caught on and I fixed it right -- or so I believe. Anyway: a regression of that problem is highly probable, so it should be regression-tested in the jison test set to ensure we're not silently failing in the future.

Error handling?

Is there any guide or reference documentation about how I can find the exact place where an error occurred? I have found a way to get the lexer error position, but for the parser there is only a line number and no offset into the document (or into the line). Am I missing something?
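One way to get at the parser's error position (a sketch based on classic jison's parseError hash; field availability may differ between versions): override parseError on parser.yy and inspect the hash, whose loc member carries line/column info for the offending token:

const parser = require('./parser').parser;       // hypothetical generated parser

parser.yy.parseError = function (msg, hash) {
    // hash typically holds { text, token, line, loc, expected, ... }
    const loc = hash.loc || {};
    throw new Error(msg + ' (line ' + (hash.line + 1) + ', column ' +
        (loc.first_column != null ? loc.first_column : '?') + ')');
};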

Now that we use recast->esprima, we can simplify the lexer grammar for regexes and action blocks

Re-use the (customized) esprima parser that comes with recast to help us parse lexer regexes and lexer and parser action blocks: we can re-use the esprima scanner for those parts!

Suggestion: use a simple lexer regex to match the start of the regex or action code block input and then consume as much as necessary using the esprima scanner to produce the appropriate jison-lex / jison grammar token.
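A rough sketch of the suggested approach (not jison's actual implementation; a real version would need to stop the scanner at the balancing brace rather than tokenize the whole remaining input):

const esprima = require('esprima');

// Given input that starts at an action block '{ ... }', let esprima's tokenizer
// find the balancing brace instead of hand-rolling a brace/string/comment counter.
function consumeActionBlock(input) {
    const tokens = esprima.tokenize(input, { range: true, tolerant: true });
    let depth = 0;
    for (const t of tokens) {
        if (t.type !== 'Punctuator') continue;
        if (t.value === '{') depth++;
        else if (t.value === '}' && --depth === 0) {
            return input.slice(0, t.range[1]);    // the complete '{ ... }' chunk
        }
    }
    return null;                                  // unterminated action block
}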

Git version looks older than NPM

Hi. I was wondering if you could push the latest version of this project to its Git repo as well, because unless I'm missing something, it looks like the NPM version is ahead. I can't even successfully build the project without hacking my way through. This is great work, and I'd love to hack on it a bit. Thank you!

GnuCOBOL

Hi

I want to use jison to generate a parser for the GnuCOBOL grammar file; can jison do this? I have already tried, but there were some errors.

Thanks
Simon

Compile to ES5?

The latest jison, 0.6.0-191, uses a few ES2015 (ES6) features (template strings, ...args) which are not supported by older Node.js versions and/or browsers (Node 5.0 and older; TravisCI fails ATM: https://travis-ci.org/GerHobbelt/jison )

It might be advisable to pull jison et al through babel before publishing in order to prevent issues stemming from running jison in older environments.

Note:
extra checks are required to ensure the generated parsers and lexers are ES5-safe!
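A minimal transpilation setup along those lines (package names, config and targets are my assumptions, not something jison-gho ships):

npm install --save-dev @babel/core @babel/cli @babel/preset-env

// babel.config.json
{
    "presets": [["@babel/preset-env", { "targets": { "node": "4", "ie": "9" } }]]
}

// transpile the generated parser (or the whole lib/ tree) before publishing:
npx babel lib/ --out-dir lib-es5/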
