node-odbc's Issues

Unsupported datatype: Hyperlink

Hi,

I only started programming at the start of the month, so apologies if any of this is incorrect.

My environment:

Windows 10 Pro version 1903 - 64bit
Node.JS version 12.9.0 - 32bit
MS Access 2013 SP1 - 32bit
.MDB database format

When querying a particular table in my MS Access database, no results or errors are displayed. After a lot of investigation, I found that if I delete the only field with the datatype "Hyperlink", everything works as expected.

I turned on debug, and the output stops as soon as this field (InvoicePath), with datatype Hyperlink, is processed:

[SQLHENV: 02D004D0][SQLHDBC: 02D07448][SQLHSTMT: 02D077A8] ODBCConnection::BindColumns(): Running SQLDescribeCol(StatementHandle = 02D077A8, ColumnNumber = 43, ColumnName = , BufferLength = 64, NameLength = 0, DataType = 0, ColumnSize = 0, DecimalDigits = 0, Nullable = 0)
[SQLHENV: 02D004D0][SQLHDBC: 02D07448][SQLHSTMT: 02D077A8] ODBCConnection::BindColumns(): SQLDescribeCol passed: ColumnName = InvoicePath, NameLength = 11, DataType = -1, ColumnSize = 1073741823, DecimalDigits = 0, Nullable = 1
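
Until Hyperlink (reported above as DataType = -1, i.e. SQL_LONGVARCHAR) is supported, a possible workaround is to select the supported columns explicitly and skip the Hyperlink field. This is only a sketch; the column and table names below are made up:

```javascript
// Hypothetical column list: everything in the table except the
// Hyperlink-typed InvoicePath field.
const columns = ['InvoiceID', 'InvoiceDate', 'Amount'];
const sql = `SELECT ${columns.join(', ')} FROM Invoices`;

// `connection` is an already-open node-odbc connection.
async function queryWithoutHyperlink(connection) {
  return connection.query(sql);
}
```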

Please could you investigate?

Much appreciated and thanks

James

Can't install ODBC 2.0.0 version of library on Windows

Hi,
Thank you for upgrading to 2.0.0 with the performance fixes. I think it really helps; so far I have only been able to install and test the new version on Linux.

On Windows, I consistently see the error below:
error C2589: '(': illegal token on right side of '::' (compiling source file ..\src\odbc_statement.cpp)

The same error appears for all .cpp files in the build, since it originates in odbc.h.

If I fix this locally, I will post the fix.

Thanks

Error: `gmake` failed with exit code: 2

Hey!

Getting this error when installing on IBM i:

remote: gmake: Entering directory '/container1/project/code/node_modules/odbc/build'
remote:   CXX(target) Release/obj.target/odbc/src/odbc.o
remote: In file included from ../src/odbc.cpp:19:0:
remote: /container1/project/code/node_modules/node-addon-api/napi.h:5:22: fatal error: functional: No such file or directory
remote:  #include <functional>
remote:                       ^
remote: compilation terminated.
remote: gmake: *** [odbc.target.mk:127: Release/obj.target/odbc/src/odbc.o] Error 1
remote: gmake: Leaving directory '/container1/project/code/node_modules/odbc/build'
remote: gyp ERR! build error
remote: gyp ERR! stack Error: `gmake` failed with exit code: 2
remote: gyp ERR! stack     at ChildProcess.onExit (/QOpenSys/pkgs/lib/container1/lib/node_modules/npm/node_modules/node-gyp/lib/build.js:262:23)
remote: gyp ERR! stack     at ChildProcess.emit (events.js:198:13)
remote: gyp ERR! stack     at Process.ChildProcess._handle.onexit (internal/child_process.js:248:12)
remote: gyp ERR! System OS400 7.3
remote: gyp ERR! command "/QOpenSys/pkgs/lib/container1/bin/node" "/QOpenSys/pkgs/lib/container1/lib/node_modules/npm/node_modules/node-gyp/bin/node-gyp.js" "configure" "build"
remote: gyp ERR! cwd /container1/project/code/node_modules/odbc
remote: gyp ERR! node -v v10.16.3
remote: gyp ERR! node-gyp -v v3.8.0
remote: gyp ERR! not ok
remote: npm WARN [email protected] No repository field.
remote: npm WARN [email protected] No license field.

Does anyone know what could be wrong?

NPM install error

Trying to re-install your package produced this error as of 30 minutes ago.

OS: Windows 10 Enterprise, v. 1803.

gyp ERR! configure error
gyp ERR! stack Error: Command failed: 
C:\Users\{NAME}\AppData\Local\Programs\Python\Python37-32\python.EXE -c import sys; print "%s.%s.%s" % sys.version_info[:3];
gyp ERR! stack   File "<string>", line 1
gyp ERR! stack     import sys; print "%s.%s.%s" % sys.version_info[:3];
gyp ERR! stack                                ^
gyp ERR! stack SyntaxError: invalid syntax
gyp ERR! stack
gyp ERR! stack     at ChildProcess.exithandler (child_process.js:294:12)
gyp ERR! stack     at ChildProcess.emit (events.js:189:13)
gyp ERR! stack     at maybeClose (internal/child_process.js:970:16)
gyp ERR! stack     at Process.ChildProcess._handle.onexit (internal/child_process.js:259:5)
gyp ERR! System Windows_NT 10.0.17134
gyp ERR! command "C:\\Program Files\\nodejs\\node.exe"          
"C:\\Users\\{NAME}\\AppData\\Roaming\\npm\\node_modules\\npm\\node_modules\\node-gyp\\bin\\node-gyp.js" "configure" "build"
gyp ERR! cwd C:\Users\{NAME}\AppData\Roaming\npm\node_modules\odbc
gyp ERR! node -v v10.15.3
gyp ERR! node-gyp -v v3.8.0
gyp ERR! not ok
npm ERR! code ELIFECYCLE
npm ERR! errno 1
npm ERR! [email protected] install: `node-gyp configure build`
npm ERR! Exit status 1
npm ERR!
npm ERR! Failed at the [email protected] install script.
npm ERR! This is probably not a problem with npm. There is likely additional logging output above.

Any feedback on this bug would be highly appreciated!

Failing to establish a connection to a 32-bit driver

OS - Windows 10, Powershell
DB - Actian SQL

I'm getting an error message from the ODBC Driver Manager indicating that node-odbc is attempting to connect using 64-bit ODBC. Unfortunately, the driver for my database only supports 32-bit.

I did attempt running PowerShell and cmd as 32-bit, but was still unsuccessful.

Are there any examples for connecting to a 32-bit driver?

'use strict'

const { series } = require('gulp')
const odbc = require('odbc')

const connection = async done => {
  try {
    console.log('connecting to ODBC driver')
    const conn = await odbc.connect('DSN=SOME_ODBC_NAME')
    console.log('connection established')
    // close connection
    conn.close(msg => {
      console.log('closed connection')
      done()
    })
  } catch (err) {
    console.log('error', err)
  }
}

module.exports = series(
  connection
)
PS C:\Users\alilland\Documents\dev\forecast-scraper> yarn run devGulp other:odbc
yarn run v1.19.1
$ set NODE_ENV=development& gulp other:odbc
[13:50:41] Using gulpfile ~\Documents\dev\forecast-scraper\gulpfile.js
[13:50:41] Starting 'other:odbc'...
[13:50:41] Starting 'connection'...
connecting to ODBC driver
error [Error: [odbc] Error connecting to the database] {
  odbcErrors: [
    {
      state: 'IM014',
      code: 0,
      message: '[Microsoft][ODBC Driver Manager] The specified DSN contains an architecture mismatch between the Driver and Application'
    }
  ]
}
[13:50:41] Finished 'connection' after 4.71 ms
[13:50:41] Finished 'other:odbc' after 6.71 ms
Done in 1.21s.
PS C:\Users\alilland\Documents\dev\forecast-scraper> 
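
The IM014 message above means the driver manager that loaded is 64-bit while the DSN points at a 32-bit driver. node-odbc is a native addon, so it is built for, and loads, whatever architecture the Node binary itself has; running the shell as 32-bit does not change that. A quick check:

```javascript
// node-odbc is compiled for the architecture of this Node binary.
// 'x64' here means the addon will go through the 64-bit ODBC Driver Manager.
console.log(process.arch);
```

If it prints x64, installing the 32-bit (ia32) build of Node.js and reinstalling odbc under it should let the addon reach the 32-bit driver.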

Auto-generated IDs

Hello!

We are attempting to switch away from the jdbc and idb connectors in favor of odbc to connect to a DB2 database. I'm looking for a way to get the id value from an auto generated identity column after I execute an insert statement. Is there a way to do that?

For example, one of our tables has an identity column defined like this...
RECORD_ID FOR COLUMN RCID DECIMAL(9, 0) GENERATED BY DEFAULT AS IDENTITY ( START WITH 1 INCREMENT BY 1 NO MINVALUE NO MAXVALUE NO CYCLE ORDER CACHE 10 ) ,

With the IDB connector, I can get the new identity value this way...
results[0][IdbConnection.GENERATED_ID_FIELD]

The result I get back from statement.execute() looks like this...
[ statement: 'insert into table(column) values(?) ', parameters: [ 'It worked!' ], return: undefined, count: 1, columns: [] ]
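
node-odbc does not expose generated keys directly, but since this is DB2, it may be possible to read the last identity value assigned on the current connection with IDENTITY_VAL_LOCAL(). A sketch (the table and column names in the insert are up to you, and this is untested here):

```javascript
// IDENTITY_VAL_LOCAL() reports the most recent identity value assigned by an
// INSERT on this same connection, so both statements must share a connection.
const idSql = 'SELECT IDENTITY_VAL_LOCAL() AS LAST_ID FROM SYSIBM.SYSDUMMY1';

// `connection` is an open node-odbc connection to DB2.
async function insertAndGetId(connection, insertSql, parameters) {
  await connection.query(insertSql, parameters);
  const rows = await connection.query(idSql);
  return rows[0].LAST_ID;
}
```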

Errors when inserting with prepared statement

I'm trying to run node-odbc against Vertica and I'm getting a strange error when running a simple prepared statement.
The row is inserted in the database, but the program exits at statement.execute() with this error:

$ node index.js
undefined:0

Error: Invalid argument

const odbc = require('odbc');

async function executeExample() {
  const connection = await odbc.connect('Driver=/Library/Vertica/ODBC/lib/libverticaodbc.dylib;Servername=localhost;Port=5433;Database=testdb;UserName=dbadmin;Password=')
  const statement = await connection.createStatement();
  await statement.prepare('INSERT INTO testtable (name) VALUES(?)');
  await statement.bind(['joe']);
  const result = await statement.execute();
  console.log(result);
  await statement.close();
  await connection.close();
}

executeExample();

Electron application crashes when trying to query a table starting with '@' in SQL Server

I'm using this library to connect to a SQL Server database which has some tables with an '@' prefix and some that don't. On the ones that don't, everything works fine, but when the app tries to query a table that has the prefix, my app simply crashes completely and quits, without any error message.

This is the code I'm using:

let connection;
const paramTable = `[${this.settings.ClientDatabaseName}].[dbo].[@OK1_PDV_PAGO]`;
try {
    // Creation of new validation tables for LiliPink
    connection = await odbc.connect(this.connectionString);
    const queryParam = `
    SELECT * FROM ${paramTable};
    `;
    const resQuery = await connection.query(queryParam);
    log.info('SAP: GetSucursal');
    const id = resQuery[0].U_Almacen.replace(/ALMT/g, '');
    await connection.close();
    return Number(id);
} catch (ex) {
    log.error('AddRedemption: ', ex);
    await connection?.close();
}

When I set debug to true in binding.gyp, I get the following:

LHSTMT: 0D898E10] ODBCConnection::FetchAll(): SQLFetch succeeded: Stored 1 rows of data, each with 1 columns
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D896998][SQLHSTMT: 0D898E10] ODBCConnection::FetchAll(): Running SQLCloseCursor(StatementHandle = 0D898E10) (Running multiple times)        
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D896998][SQLHSTMT: 0D898E10] ODBCConnection::FetchAll(): SQLCloseCursor succeeded
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D896998]ODBCConnection::QueryAsyncWorker::OnOk()
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D88FC30] ODBCConnection::~ODBCConnection
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D88D7B8] ODBCConnection::~ODBCConnection
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D8920A8] ODBCConnection::~ODBCConnection
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D894520] ODBCConnection::~ODBCConnection
  [SQLHENV: 0A00CAF0][SQLHDBC: 00000000] ODBCConnection::~ODBCConnection
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D896998] ODBCConnection::~ODBCConnection
  [SQLHENV: 0A00CAF0] ODBC::Connect()
  [SQLHENV: 0A00CAF0] ODBC::ConnectAsyncWorker::Execute()
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8] ODBC::ConnectAsyncWorker::Execute(): Calling SQLGetInfo(ConnectionHandle = 0D89E3D8, InfoType = 30 (SQL_MAX_COLUMN_NAME_LEN), InfoValuePtr = 09C13408, BufferLength = 2, StringLengthPtr = 0)
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8] ODBC::ConnectAsyncWorker::Execute(): SQLGetInfo succeeded: SQLRETURN = 0, InfoValue = 128
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8] ODBC::ConnectAsyncWorker::Execute(): Calling SQLGetInfo(ConnectionHandle = 0A00CAF0, InfoType = 72 (SQL_TXN_ISOLATION_OPTION), InfoValuePtr = 09C1340C, BufferLength = 4, StringLengthPtr = 0)
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8] ODBC::ConnectAsyncWorker::Execute(): SQLGetInfo succeeded: SQLRETURN = 0, InfoValue = 15
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8] ODBC::ConnectAsyncWorker::OnOk()
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8] ODBCConnection::Query()
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8] ODBCConnection::QueryAsyncWorker::Execute(): Running SQL '
              SELECT * FROM [PRUEBASLEAL].[dbo].[@OK1_PDV_PAGO];
              '
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8] ODBCConnection::QueryAsyncWorker::Execute(): Running SQLAllocHandle(HandleType = SQL_HANDLE_STMT, InputHandle = 0D89E3D8, OutputHandlePtr = 10F59AC8)
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::QueryAsyncWorker::Execute(): Calling SQLExecDirect(Statment Handle = 0D89F0A0, StatementText = 
              SELECT * FROM [PRUEBASLEAL].[dbo].[@OK1_PDV_PAGO];
              , TextLength = SQL_NTS)
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::QueryAsyncWorker::Execute(): SQLExecDirect passed with SQLRETURN = 0
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::RetrieveResultSet()
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::RetrieveResultSet(): Running SQLRowCount(StatementHandle = 0D89F0A0, RowCount = 0)
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::RetrieveResultSet(): SQLRowCount passed: SQLRETURN = 0, RowCount = -1
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns()
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): Running SQLNumResultCols(StatementHandle = 0D89F0A0, ColumnCount = 0
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): SQLNumResultCols passed: ColumnCount = 32
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): Running SQLDescribeCol(StatementHandle = 0D89F0A0, ColumnNumber = 1, ColumnName 
= , BufferLength = 128, NameLength = 0, DataType = 0, ColumnSize = 0, DecimalDigits = 0, Nullable = 0)
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): SQLDescribeCol passed: ColumnName = DocEntry, NameLength = 8, DataType = 4, ColumnSize = 10, DecimalDigits = 0, Nullable = 0
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): Running SQLBindCol(StatementHandle = 0D89F0A0, ColumnNumber = 1, TargetType = -16, TargetValuePtr = 10F5DFA0, BufferLength = 10, StrLen_or_Ind = 0
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): SQLBindCol succeeded: StrLeng_or_IndPtr = 0
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): Running SQLDescribeCol(StatementHandle = 0D89F0A0, ColumnNumber = 2, ColumnName 
= , BufferLength = 128, NameLength = 0, DataType = 0, ColumnSize = 0, DecimalDigits = 0, Nullable = 0)
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): SQLDescribeCol passed: ColumnName = DocNum, NameLength = 6, DataType = 4, ColumnSize = 10, DecimalDigits = 0, Nullable = 1
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): Running SQLBindCol(StatementHandle = 0D89F0A0, ColumnNumber = 2, TargetType = -16, TargetValuePtr = 10F5DE20, BufferLength = 10, StrLen_or_Ind = 0
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): SQLBindCol succeeded: StrLeng_or_IndPtr = 0
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): Running SQLDescribeCol(StatementHandle = 0D89F0A0, ColumnNumber = 3, ColumnName 
= , BufferLength = 128, NameLength = 0, DataType = 0, ColumnSize = 0, DecimalDigits = 0, Nullable = 0)
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): SQLDescribeCol passed: ColumnName = Period, NameLength = 6, DataType = 4, ColumnSize = 10, DecimalDigits = 0, Nullable = 1
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): Running SQLBindCol(StatementHandle = 0D89F0A0, ColumnNumber = 3, TargetType = -16, TargetValuePtr = 10F5DE68, BufferLength = 10, StrLen_or_Ind = 0
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): SQLBindCol succeeded: StrLeng_or_IndPtr = 0
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): Running SQLDescribeCol(StatementHandle = 0D89F0A0, ColumnNumber = 4, ColumnName 
= , BufferLength = 128, NameLength = 0, DataType = 0, ColumnSize = 0, DecimalDigits = 0, Nullable = 0)
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): SQLDescribeCol passed: ColumnName = Instance, NameLength = 8, DataType = 5, ColumnSize = 5, DecimalDigits = 0, Nullable = 1
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): Running SQLBindCol(StatementHandle = 0D89F0A0, ColumnNumber = 4, TargetType = -16, TargetValuePtr = 10F32AD0, BufferLength = 5, StrLen_or_Ind = 0
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): SQLBindCol succeeded: StrLeng_or_IndPtr = 0
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): Running SQLDescribeCol(StatementHandle = 0D89F0A0, ColumnNumber = 5, ColumnName 
= , BufferLength = 128, NameLength = 0, DataType = 0, ColumnSize = 0, DecimalDigits = 0, Nullable = 0)
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): SQLDescribeCol passed: ColumnName = Series, NameLength = 6, DataType = 4, ColumnSize = 10, DecimalDigits = 0, Nullable = 1
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): Running SQLBindCol(StatementHandle = 0D89F0A0, ColumnNumber = 5, TargetType = -16, TargetValuePtr = 10F5DE98, BufferLength = 10, StrLen_or_Ind = 0
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): SQLBindCol succeeded: StrLeng_or_IndPtr = 0
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): Running SQLDescribeCol(StatementHandle = 0D89F0A0, ColumnNumber = 6, ColumnName 
= , BufferLength = 128, NameLength = 0, DataType = 0, ColumnSize = 0, DecimalDigits = 0, Nullable = 0)
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): SQLDescribeCol passed: ColumnName = Handwrtten, NameLength = 10, DataType = 1, ColumnSize = 1, DecimalDigits = 0, Nullable = 1
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): Running SQLBindCol(StatementHandle = 0D89F0A0, ColumnNumber = 6, TargetType = 1, TargetValuePtr = 10F32A80, BufferLength = 2, StrLen_or_Ind = 0
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): SQLBindCol succeeded: StrLeng_or_IndPtr = 0
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): Running SQLDescribeCol(StatementHandle = 0D89F0A0, ColumnNumber = 7, ColumnName 
= , BufferLength = 128, NameLength = 0, DataType = 0, ColumnSize = 0, DecimalDigits = 0, Nullable = 0)
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): SQLDescribeCol passed: ColumnName = Canceled, NameLength = 8, DataType = 1, ColumnSize = 1, DecimalDigits = 0, Nullable = 1
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): Running SQLBindCol(StatementHandle = 0D89F0A0, ColumnNumber = 7, TargetType = 1, TargetValuePtr = 10F32A90, BufferLength = 2, StrLen_or_Ind = 0
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): SQLBindCol succeeded: StrLeng_or_IndPtr = 0
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): Running SQLDescribeCol(StatementHandle = 0D89F0A0, ColumnNumber = 8, ColumnName 
= , BufferLength = 128, NameLength = 0, DataType = 0, ColumnSize = 0, DecimalDigits = 0, Nullable = 0)
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): SQLDescribeCol passed: ColumnName = Object, NameLength = 6, DataType = -9, ColumnSize = 20, DecimalDigits = 0, Nullable = 1
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): Running SQLBindCol(StatementHandle = 0D89F0A0, ColumnNumber = 8, TargetType = -8, TargetValuePtr = 10F5D930, BufferLength = 42, StrLen_or_Ind = 0
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): SQLBindCol succeeded: StrLeng_or_IndPtr = 0
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): Running SQLDescribeCol(StatementHandle = 0D89F0A0, ColumnNumber = 9, ColumnName 
= , BufferLength = 128, NameLength = 0, DataType = 0, ColumnSize = 0, DecimalDigits = 0, Nullable = 0)
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): SQLDescribeCol passed: ColumnName = LogInst, NameLength = 7, DataType = 4, ColumnSize = 10, DecimalDigits = 0, Nullable = 1
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): Running SQLBindCol(StatementHandle = 0D89F0A0, ColumnNumber = 9, TargetType = -16, TargetValuePtr = 10F5DDF0, BufferLength = 10, StrLen_or_Ind = 0
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): SQLBindCol succeeded: StrLeng_or_IndPtr = 0
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): Running SQLDescribeCol(StatementHandle = 0D89F0A0, ColumnNumber = 10, ColumnName = , BufferLength = 128, NameLength = 0, DataType = 0, ColumnSize = 0, DecimalDigits = 0, Nullable = 0)
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): SQLDescribeCol passed: ColumnName = UserSign, NameLength = 8, DataType = 4, ColumnSize = 10, DecimalDigits = 0, Nullable = 1
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): Running SQLBindCol(StatementHandle = 0D89F0A0, ColumnNumber = 10, TargetType = -16, TargetValuePtr = 10F5E018, BufferLength = 10, StrLen_or_Ind = 0
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): SQLBindCol succeeded: StrLeng_or_IndPtr = 0
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): Running SQLDescribeCol(StatementHandle = 0D89F0A0, ColumnNumber = 11, ColumnName = , BufferLength = 128, NameLength = 0, DataType = 0, ColumnSize = 0, DecimalDigits = 0, Nullable = 0)
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): SQLDescribeCol passed: ColumnName = Transfered, NameLength = 10, DataType = 1, ColumnSize = 1, DecimalDigits = 0, Nullable = 1
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): Running SQLBindCol(StatementHandle = 0D89F0A0, ColumnNumber = 11, TargetType = 1, TargetValuePtr = 10F32B80, BufferLength = 2, StrLen_or_Ind = 0
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): SQLBindCol succeeded: StrLeng_or_IndPtr = 0
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): Running SQLDescribeCol(StatementHandle = 0D89F0A0, ColumnNumber = 12, ColumnName = , BufferLength = 128, NameLength = 0, DataType = 0, ColumnSize = 0, DecimalDigits = 0, Nullable = 0)
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): SQLDescribeCol passed: ColumnName = Status, NameLength = 6, DataType = 1, ColumnSize = 1, DecimalDigits = 0, Nullable = 1
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): Running SQLBindCol(StatementHandle = 0D89F0A0, ColumnNumber = 12, TargetType = 1, TargetValuePtr = 10F329B0, BufferLength = 2, StrLen_or_Ind = 0
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): SQLBindCol succeeded: StrLeng_or_IndPtr = 0
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): Running SQLDescribeCol(StatementHandle = 0D89F0A0, ColumnNumber = 13, ColumnName = , BufferLength = 128, NameLength = 0, DataType = 0, ColumnSize = 0, DecimalDigits = 0, Nullable = 0)
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): SQLDescribeCol passed: ColumnName = CreateDate, NameLength = 10, DataType = 93, 
ColumnSize = 23, DecimalDigits = 3, Nullable = 1
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): Running SQLBindCol(StatementHandle = 0D89F0A0, ColumnNumber = 13, TargetType = 1, TargetValuePtr = 10F38DC8, BufferLength = 24, StrLen_or_Ind = 0
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): SQLBindCol succeeded: StrLeng_or_IndPtr = 0
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): Running SQLDescribeCol(StatementHandle = 0D89F0A0, ColumnNumber = 14, ColumnName = , BufferLength = 128, NameLength = 0, DataType = 0, ColumnSize = 0, DecimalDigits = 0, Nullable = 0)
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): SQLDescribeCol passed: ColumnName = CreateTime, NameLength = 10, DataType = 5, ColumnSize = 5, DecimalDigits = 0, Nullable = 1
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): Running SQLBindCol(StatementHandle = 0D89F0A0, ColumnNumber = 14, TargetType = -16, TargetValuePtr = 10F32A60, BufferLength = 5, StrLen_or_Ind = 0
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): SQLBindCol succeeded: StrLeng_or_IndPtr = 0
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): Running SQLDescribeCol(StatementHandle = 0D89F0A0, ColumnNumber = 15, ColumnName = , BufferLength = 128, NameLength = 0, DataType = 0, ColumnSize = 0, DecimalDigits = 0, Nullable = 0)
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): SQLDescribeCol passed: ColumnName = UpdateDate, NameLength = 10, DataType = 93, 
ColumnSize = 23, DecimalDigits = 3, Nullable = 1
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): Running SQLBindCol(StatementHandle = 0D89F0A0, ColumnNumber = 15, TargetType = 1, TargetValuePtr = 10F38E88, BufferLength = 24, StrLen_or_Ind = 0
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): SQLBindCol succeeded: StrLeng_or_IndPtr = 0
  [SQLHENV: 0A00CAF0][SQLHDBC: 0D89E3D8][SQLHSTMT: 0D89F0A0] ODBCConnection::BindColumns(): Running SQLDescribeCol(Stat

Create pre-built binaries for commonly used systems

Currently, odbc will install from the source files when downloaded from NPM. This is prone to breaking, requires special flags/definitions on each system, and often requires users to have special software installed (e.g. Visual Studio compilers on Windows) (see #34, #59).

Instead, we should use a system like [node-pre-gyp](https://www.npmjs.com/package/node-pre-gyp) to download and install pre-built binaries for commonly used systems, and only fall back to local compilation when there is no binary for a system.

Systems to support should include:

  • Windows (x86)
  • Windows (ARM)
  • MacOS
  • Linux (x86)
  • Linux (POWER)
  • IBM i

NPM install error

Hello,

I am trying to install the package, but npm i odbc fails with the following message:

> [email protected] install /my_project_path/node_modules/odbc
> node-gyp configure build

  CXX(target) Release/obj.target/odbc/src/odbc.o
In file included from ../src/odbc.cpp:19:
/my_project_path/node_modules/node-addon-api/napi.h:30:6: error: Exception support not detected. Define either NAPI_CPP_EXCEPTIONS or NAPI_DISABLE_CPP_EXCEPTIONS.
    #error Exception support not detected. \
     ^
1 error generated.
make: *** [Release/obj.target/odbc/src/odbc.o] Error 1
gyp ERR! build error 
gyp ERR! stack Error: `make` failed with exit code: 2
gyp ERR! stack     at ChildProcess.onExit (/usr/local/lib/node_modules/npm/node_modules/node-gyp/lib/build.js:262:23)
gyp ERR! stack     at ChildProcess.emit (events.js:189:13)
gyp ERR! stack     at Process.ChildProcess._handle.onexit (internal/child_process.js:248:12)
gyp ERR! System Darwin 18.6.0
gyp ERR! command "/usr/local/bin/node" "/usr/local/lib/node_modules/npm/node_modules/node-gyp/bin/node-gyp.js" "configure" "build"
gyp ERR! cwd /my_project_path/proxy-accelo/node_modules/odbc
gyp ERR! node -v v10.15.3
gyp ERR! node-gyp -v v3.8.0
gyp ERR! not ok 
npm ERR! code ELIFECYCLE
npm ERR! errno 1
npm ERR! [email protected] install: `node-gyp configure build`
npm ERR! Exit status 1
npm ERR! 
npm ERR! Failed at the [email protected] install script.
npm ERR! This is probably not a problem with npm. There is likely additional logging output above.

npm ERR! A complete log of this run can be found in:
npm ERR!     ~/.npm/_logs/2019-06-24T14_11_34_214Z-debug.log

FYI: I am on Mac OS Mojave (10.14.5).

Any idea what I am not doing right?

Thanks in advance for your time,
Best regards.

Error messages too generic

Hi there,

When executing a query that results in a database error (e.g. if there is something wrong with the syntax of the query), the odbc package throws an exception as expected. However, the exception contains only a very generic error message.

Example:

const statement = await connection.createStatement();
await statement.prepare("select * from nonexistingtable");
await statement.execute();

I expect the above code to throw this error message:
Invalid object name 'nonexistingtable'.

Yet, node-odbc throws this error message:
[odbc] Error executing the statement

Is there any way to retrieve the error which caused the exception thrown by node-odbc?
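
From connection errors I have seen, node-odbc attaches the driver's diagnostic records to the thrown error as an odbcErrors array, so the specific message may be recoverable in the catch block. A sketch (the SQLSTATE and text in the comments are what SQL Server would typically report, not guaranteed):

```javascript
// `connection` is an open node-odbc connection.
async function runStatement(connection) {
  try {
    const statement = await connection.createStatement();
    await statement.prepare('select * from nonexistingtable');
    await statement.execute();
  } catch (err) {
    console.error(err.message);    // generic: "[odbc] Error executing the statement"
    console.error(err.odbcErrors); // driver diagnostics, e.g. state '42S02' with
                                   // "Invalid object name 'nonexistingtable'."
    throw err;
  }
}
```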

CP1250 encoding problem from odbc data

Hello,

I am having many problems with non-Unicode characters coming from the database.
The database is IBM Informix, and DB_LOCALE=cs_cz.cp1250 is set in odbc.ini.
I get the correct result when running from ISQL (a small SQL tool installed with the Informix SDK); the result looks like this:
(screenshot: correct output in ISQL)

The PuTTY terminal is set to CP1250:
(screenshot: PuTTY terminal configured for CP1250)
So, I guess ODBC works correctly.

But from Node, with node-odbc, this simple code gives totally wrong results:

const odbc = require('odbc');

async function connectToDatabase() {
  const connectionConfig = {
    connectionString: 'DSN=cistoca;encoding=cp1250',
    connectionTimeout: 10,
    loginTimeout: 10,
  };
  const connection2 = await odbc.connect(connectionConfig);

  connection2.query('SELECT naz_par FROM pos_par LIMIT 10', function (err, data) {
    if (err) {
      console.log('Error from connection!', err);
      return;
    }
    for (var i = 0; i < data.length; i++) {
      console.log(data[i].naz_par);
    }
  });
  // connection2 is now an open Connection
}
connectToDatabase();

Here is the result:
(screenshot: garbled output from node-odbc)

When I switch the terminal to UTF-8, I get this:
(screenshot: terminal switched to UTF-8)

And when I run the program again, here is the result:
(screenshot: output after re-running)

I'm desperate and don't know how to resolve this.
I tried most of what Stack Overflow has to offer before asking here for your help, but without success.

Thank you in advance!
Stanko

Second conn.close() causes the process to exit

I am testing the latest 2.0.0 version, without a connection pool: just connect and execute a query.

My scenario:
(This issue looks like specific to Multiple DB Connections using 2 Promises. If I execute ONLY one query at a time, close connection is fine. No issues)

Two DB connections executing two queries at same times and Promise.All waits for both
Both queries complete and I am seeing results read from both.
First DB Connection close works without issues
Second DB connection.close command breaks the process - terminating and returning code: 3221225477
FYI; I commented out connection.close and any number of times I am able to run queries.. (This is without using any Connection Pool).

But my concern is if I don't call Close, too many connection will be left open and soon that will be problem.

Below is sample code I am using.. please let me know if you see any issues:

Current code uses callback approach - so left it that way

odbc.connect(connString, function(err, conn) {
  conn.query(queryText, [], function(err, data) {
    // Check errors etc.

    // Close connection
    // If I comment this out, I see no issues with the 2 connection/query scenario
    // When there is an error, the process exits and I don't see any additional messages
    conn.close((err) => {
      if (err) {
        console.log('executeODBCQuery: Connection closed. (ERROR)');
      } else {
        console.log('executeODBCQuery: Connection closed');
      }
    });
  });
});

Not returning results properly on windows

connection.callProcedure() returns only the first char of each parameter
and
statement.execute() returns the same parameters that were passed in.

I have not tested this on ibm yet because of my other issue #38

unable to install on windows - 'BigInt': is not a member of 'Napi'

  • Windows 10 build 1903
  • Nodejs v8.14.1
  • node-gyp v5.0.3
  • python 2.7

Following is the output from powershell:

PS E:\myprojectpath\> npm install odbc --save

> [email protected] install E:\myprojectpath\\node_modules\odbc
> node-gyp configure build


E:\myprojectpath\\node_modules\odbc>if not defined npm_config_node_gyp (node "C:\Program Files\nodejs\node_modules\npm\node_modules\npm-lifecycle\node-gyp-bin\\..\..\node_modules\node-gyp\bin\node-gyp.js" configure build )  else (node "C:\Program Files\nodejs\node_modules\npm\node_modules\node-gyp\bin\node-gyp.js" configure build )
Building the projects in this solution one at a time. To enable parallel build, please add the "/m" switch.
  odbc.cpp
  odbc_connection.cpp
  odbc_statement.cpp
  dynodbc.cpp
  win_delay_load_hook.cc
E:\myprojectpath\\node_modules\odbc\src\odbc_connection.cpp(93): warning C4312: 'type cast': conversion from 'SQLUINTEGER' to 'SQLPOINTER' of greater size [E:\
git\sql-sync\node_modules\odbc\build\odbc.vcxproj]
E:\myprojectpath\\node_modules\odbc\src\odbc_connection.cpp(355): error C2039: 'BigInt': is not a member of 'Napi' [E:\myprojectpath\\node_modules\odbc\build\odb
c.vcxproj]
  E:\myprojectpath\\node_modules\node-addon-api\napi-inl.h(15): note: see declaration of 'Napi'
E:\myprojectpath\\node_modules\odbc\src\odbc_connection.cpp(355): error C3083: 'BigInt': the symbol to the left of a '::' must be a type [E:\myprojectpath\\node_
modules\odbc\build\odbc.vcxproj]
E:\myprojectpath\\node_modules\odbc\src\odbc_connection.cpp(355): error C2039: 'New': is not a member of 'Napi' [E:\myprojectpath\\node_modules\odbc\build\odbc.v
cxproj]
  E:\myprojectpath\\node_modules\node-addon-api\napi-inl.h(15): note: see declaration of 'Napi'
E:\myprojectpath\\node_modules\odbc\src\odbc_connection.cpp(355): error C2664: 'Napi::Reference<Napi::Object> Napi::Reference<Napi::Object>::New(const T &,uint
32_t)': cannot convert argument 1 from 'Napi::Env' to 'const T &' [E:\myprojectpath\\node_modules\odbc\build\odbc.vcxproj]
          with
          [
              T=Napi::Object
          ]
  E:\myprojectpath\\node_modules\odbc\src\odbc_connection.cpp(355): note: Reason: cannot convert from 'Napi::Env' to 'const T'
          with
          [
              T=Napi::Object
          ]
  E:\myprojectpath\\node_modules\odbc\src\odbc_connection.cpp(355): note: No user-defined-conversion operator available that can perform this conversion, or th
  e operator cannot be called
E:\myprojectpath\\node_modules\odbc\src\odbc_connection.cpp(861): error C2039: 'BigInt': is not a member of 'Napi' [E:\myprojectpath\\node_modules\odbc\build\odb
c.vcxproj]
  E:\myprojectpath\\node_modules\node-addon-api\napi-inl.h(15): note: see declaration of 'Napi'
E:\myprojectpath\\node_modules\odbc\src\odbc_connection.cpp(861): error C3083: 'BigInt': the symbol to the left of a '::' must be a type [E:\myprojectpath\\node_
modules\odbc\build\odbc.vcxproj]
E:\myprojectpath\\node_modules\odbc\src\odbc_connection.cpp(861): error C2039: 'New': is not a member of 'Napi' [E:\myprojectpath\\node_modules\odbc\build\odbc.v
cxproj]
  E:\myprojectpath\\node_modules\node-addon-api\napi-inl.h(15): note: see declaration of 'Napi'
E:\myprojectpath\\node_modules\odbc\src\odbc_connection.cpp(861): error C2664: 'Napi::Reference<Napi::Object> Napi::Reference<Napi::Object>::New(const T &,uint
32_t)': cannot convert argument 1 from 'Napi::Env' to 'const T &' [E:\myprojectpath\\node_modules\odbc\build\odbc.vcxproj]
          with
          [
              T=Napi::Object
          ]
  E:\myprojectpath\\node_modules\odbc\src\odbc_connection.cpp(861): note: Reason: cannot convert from 'Napi::Env' to 'const T'
          with
          [
              T=Napi::Object
          ]
  E:\myprojectpath\\node_modules\odbc\src\odbc_connection.cpp(861): note: No user-defined-conversion operator available that can perform this conversion, or th
  e operator cannot be called
E:\myprojectpath\\node_modules\odbc\src\odbc.cpp(537): error C2039: 'IsBigInt': is not a member of 'Napi::Value' [E:\myprojectpath\\node_modules\odbc\build\odbc.
vcxproj]
  E:\myprojectpath\\node_modules\node-addon-api\napi.h(190): note: see declaration of 'Napi::Value'
E:\myprojectpath\\node_modules\odbc\src\odbc.cpp(541): error C2039: 'BigInt': is not a member of 'Napi' [E:\myprojectpath\\node_modules\odbc\build\odbc.vcxproj]
  E:\myprojectpath\\node_modules\node-addon-api\napi-inl.h(15): note: see declaration of 'Napi'
E:\myprojectpath\\node_modules\odbc\src\odbc.cpp(541): error C2065: 'BigInt': undeclared identifier [E:\myprojectpath\\node_modules\odbc\build\odbc.vcxproj]
E:\myprojectpath\\node_modules\odbc\src\odbc.cpp(541): error C2672: 'Napi::Value::As': no matching overloaded function found [E:\myprojectpath\\node_modules\odbc
\build\odbc.vcxproj]
E:\myprojectpath\\node_modules\odbc\src\odbc.cpp(541): error C2974: 'Napi::Value::As': invalid template argument for 'T', type expected [E:\myprojectpath\\node_m
odules\odbc\build\odbc.vcxproj]
  E:\myprojectpath\\node_modules\node-addon-api\napi.h(265): note: see declaration of 'Napi::Value::As'
gyp ERR! build error
gyp ERR! stack Error: `"` failed with exit code: 1
gyp ERR! stack     at ChildProcess.onExit (C:\Program Files\nodejs\node_modules\npm\node_modules\node-gyp\lib\build.js:262:23)
gyp ERR! stack     at emitTwo (events.js:126:13)
gyp ERR! stack     at ChildProcess.emit (events.js:214:7)
gyp ERR! stack     at Process.ChildProcess._handle.onexit (internal/child_process.js:198:12)
gyp ERR! System Windows_NT 10.0.18362
gyp ERR! command "C:\\Program Files\\nodejs\\node.exe" "C:\\Program Files\\nodejs\\node_modules\\npm\\node_modules\\node-gyp\\bin\\node-gyp.js" "configure" "build"
gyp ERR! cwd E:\myprojectpath\\node_modules\odbc
gyp ERR! node -v v8.14.1
gyp ERR! node-gyp -v v3.8.0
gyp ERR! not ok
npm WARN [email protected] No description

npm ERR! code ELIFECYCLE
npm ERR! errno 1
npm ERR! [email protected] install: `node-gyp configure build`
npm ERR! Exit status 1
npm ERR!
npm ERR! Failed at the [email protected] install script.
npm ERR! This is probably not a problem with npm. There is likely additional logging output above.

npm ERR! A complete log of this run can be found in:
npm ERR!     C:\Users\myuser\AppData\Roaming\npm-cache\_logs\2019-09-05T19_24_44_524Z-debug.log

Node-Odbc on linux with Azure Always encrypted doesn't work for insert sql

Environment: the version of node-odbc is v6.13.4, which came from a direct "npm install odbc". Also, the Node version is 10.19 running on a Linux setup.

  • The insert query looks like the one below (I'm also including a select query that works for Always Encrypted columns). Note: "Always Encrypted on" using Azure Key Vault is how the DB columns are encrypted.

var db = require("odbc");
var connectionString = "Driver={ODBC Driver 13 for SQL Server};Server=<Server_name_here>;Database=<datbase_name_here>;Uid=<UID_name_here>;Pwd=<pwd_name_here>;Encrypt=yes;TrustServerCertificate=no;Connection Timeout=30;ColumnEncryption=Enabled;KeyStoreAuthentication=KeyVaultClientSecret;KeyStorePrincipalId=<ID_here>;KeyStoreSecret=<Secret_key_here>;"

//Select Query on encrypted DB column works! Able to receive decrypted data
const connection = await db.connect(connectionString);
const query = await connection.query ('SELECT * FROM Patients.Encrypted')
console.log(query); //Success

//Insert Query that works on non-encrypted column
const connection = await db.connect(connectionString);
const query = await connection.query ('INSERT INTO Patients.NonEncrypted (SSN, FirstName, LastName) VALUES(?, ?, ?)',['123','John','Doe'])
console.log(query); //Success

//Insert Query that fails on encrypted column
const connection = await db.connect(connectionString);
const query = await connection.query ('INSERT INTO Patients.Encrypted (SSN, FirstName, LastName) VALUES(?, ?, ?)',['123','John','Doe'])
console.log(query); //Failure [odbc] Error getting information about parameters

//Insert Query that fails on encrypted column with binding option (MS says it's mandatory to bind parameters)
const connection = await db.connect(connectionString);
const statement = await connection.createStatement();
await statement.prepare('INSERT INTO Patients.Encrypted (SSN, FirstName, LastName) VALUES(?, ?, ?)');
await statement.bind([{ sql_type: 1, value: '123-45-6789', precision: 0 }, { sql_type: 12, value: 'John', precision: 0 },{ sql_type: 12, value: 'Doe', precision: 0 }]);
const query = await statement.execute();
console.log(query); //Failure [odbc] Error binding parameters to the statement
await statement.close();

This is the summary of the issue: the node-odbc library is able to pull from encrypted columns, but I have not yet succeeded in pushing/inserting to encrypted columns. Any guidance is much appreciated. I think I'm going wrong in creating the correct parameters for bind, but I need someone to let me know how this is done.

https://docs.microsoft.com/en-us/sql/connect/odbc/using-always-encrypted-with-the-odbc-driver?view=sql-server-ver15

Empty error when 0 rows updated

First of all thank you a lot for all the support and implementation of this module.
Great work!

However I faced a bug and decided to share my thoughts about it with you.

So, the following code:

"use strict";

const odbc = require("odbc");

odbc.connect("Dsn=example;Uid=example;Pwd=example;", function (err, connection) {
    if (err) {
        return console.error(err);
    }
    connection.query("UPDATE some_entity SET value = 10000 WHERE book = (SELECT id FROM books WHERE name = 'Best book of the world')", function (error, results) {
        if (error) {
            return console.error(error);
        }
        console.log(results);
    });
});

Shows the following output on the terminal:
[Error: "errors": []]

This error is shown when I execute an UPDATE query that the WHERE criteria does not match any rows, if it matches at least 1 row, there's no error...
I don't know if this is the intended behaviour, but I don't think it should be.
Imagine the following situation: I want to execute an UPDATE, and if no rows were updated on that same query, I want to be able to execute an INSERT query instead.

Currently using version 2.0.0-4.
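On the update-or-insert use case: assuming the spurious error on zero-row updates is fixed (or caught), the `count` property that node-odbc attaches to its result arrays (rows affected) can drive the fallback insert. A sketch only - `upsertValue` and the table/columns are illustrative, not from the library:

```javascript
// Sketch of "update, else insert" using the `count` property node-odbc
// attaches to its result arrays (the rows-affected value).
async function upsertValue(connection, book, value) {
  const result = await connection.query(
    'UPDATE some_entity SET value = ? WHERE book = ?', [value, book]);
  if (result.count === 0) {
    // nothing matched the WHERE clause: fall back to an insert
    await connection.query(
      'INSERT INTO some_entity (book, value) VALUES (?, ?)', [book, value]);
  }
}
```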

Handle drivers that return incorrect ColumnSize for SQL_TYPE_TIME and SQL_TYPE_DATE

Per #60, some drivers (e.g. the 4D ODBC driver) do not return the correct ColumnSize for certain data types, particularly SQL_TYPE_TIME and SQL_TYPE_DATE. These data types should return at least 8 and exactly 10 respectively, and so the program should do a little sanity checking to make sure that the values returned are within acceptable bounds. If not, it should override the driver output.

(Note: #60 was closed because I took too long to get to it, a lot of the changes were already fixed in another branch as part of 2.3.0)

I don't understand the reasoning behind mixing rows and result metadata

Hi Mark,

As per the heading, can you please explain what the advantage of this pattern is:

[ { colnm1: "value1", colnm2: "value2" },
  { colnm1: "value1", colnm2: "value2" },
  statement: "select bla bla..",
  return: ...,
  columns: [ {etc} ] ]

With respect - this makes no sense to me. While I see the value in returning the metadata in a general sense, surely the aim of the library would be better served by including the row objects in its own tuple, for example:

{
   rows: [ { col1: val1, col2: val2 }. { another row }, {etc} ],
   statement: ...,
   ...,
   columns : [ {col1}, {col2} ] 
}

The way you've implemented it means that one needs to iterate over EACH member of the array and do a typeof check to determine that it is indeed a row - and this is unnecessary and slow. You can't "at a glance" determine that your result set is empty (since there is no row array to have its length queried). One would need to extract each row object and add it to a new array if you want to pass the data to a function which expects an array of row objects.

Alternatively, you could delete the parameters/count/columns objects from the array, but this would introduce a potential regression if ever you were to add another tuple to the array with an obviously unknown name.

The only way to really fix this is to monkeypatch a hook over each DB call to format the result object into a more usable state, but when your SQL is routinely returning 100,000+ records - that kind of thing rapidly becomes a needless performance bottleneck.

I've been beating myself up trying to figure out what I'm missing here, but....

Thanks in advance!
Rhianne
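For reference, the metadata fields are attached as named (non-index) properties on a genuine Array, so index-based operations - `.length`, `for...of`, spreads - see only the rows and need no per-element type check. A driver-free demonstration of the pattern:

```javascript
// Simulate node-odbc's result shape: a real Array of row objects
// with metadata attached as named (non-index) properties.
const result = [
  { colnm1: 'value1', colnm2: 'value2' },
  { colnm1: 'value3', colnm2: 'value4' },
];
result.statement = 'select bla bla..';
result.count = -1;
result.columns = [{ name: 'colnm1', dataType: 12 }];

console.log(result.length);   // 2 -- named properties don't count
const rowsOnly = [...result]; // spread copies only the indexed rows
console.log(rowsOnly.length); // 2
for (const row of result) {
  // iterates rows only, never `statement`/`count`/`columns`
  console.log(row.colnm1);
}
```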

"Communication link failure" state:"08S01" on a connection pool

Hi,
I have question about using connectionpool on my server.
I have defined a global variable holding the connection pool, and when I want to query the database I use it. That works fine, but after the pool sits idle for a while the server seems to close the connections and I face this error: "Communication link failure" state:"08S01", whose cause is explained here.
https://www.ibm.com/support/pages/odbc-client-connections-dropping-intermittently-error-communication-link-failure

Is there any way to workaround this problem in node-odbc?
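One common workaround, until connections can be validated in the pool itself, is a wrapper that reconnects and retries once when the driver reports SQLSTATE 08S01 (exposed on the error's `odbcErrors` array). A sketch under those assumptions - `queryWithRetry` and `makeConnection` are made-up names, with `makeConnection` standing in for however you obtain a fresh connection, e.g. `() => odbc.connect(connectionString)`:

```javascript
const MAX_ATTEMPTS = 2;

// Run a query, reconnecting and retrying once if the driver reports
// SQLSTATE 08S01 ("communication link failure").
async function queryWithRetry(makeConnection, sql, params = []) {
  let lastError;
  for (let attempt = 1; attempt <= MAX_ATTEMPTS; attempt++) {
    const connection = await makeConnection();
    try {
      return await connection.query(sql, params);
    } catch (err) {
      lastError = err;
      const linkFailure = (err.odbcErrors || []).some(d => d.state === '08S01');
      if (!linkFailure) throw err;
      // stale connection: loop around and try once more on a fresh one
    } finally {
      try { await connection.close(); } catch (ignored) { /* already dead */ }
    }
  }
  throw lastError;
}
```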

Allow named bind parameters

Feature for named bind parameters:

Definition:

.bind(parameters, callback?)

parameters: An array of values or an object with named properties to bind to the SQL statement previously prepared. All parameters will be input parameters. When passing an array, the number of values passed in the array must match the number of parameters to bind to in the prepared statement. When passing an object each property name should match a bound parameter in the prepared statement. Each property value will be used as the bound parameter value.

Example:

const odbc = require('odbc');
 
// can only use await keyword in an async function
async function bindExample() {
    const connection = await odbc.connect(`${process.env.CONNECTION_STRING}`);
    const statement = await connection.createStatement();
    await statement.prepare('INSERT INTO MY_TABLE VALUES(:id, :name)');
    // Assuming MY_TABLE has INTEGER and VARCHAR fields.
    await statement.bind({ id: 1, name: 'Name' });
    // statement has been prepared and values bound, can now execute
}
 
bindExample();
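Until something like this lands, it can be approximated in userland by rewriting `:name` placeholders into positional `?` markers and ordering the values to match - a naive sketch (`toPositional` is a made-up helper, and it does not skip placeholders inside string literals or comments):

```javascript
// Convert ':name' placeholders to positional '?' markers plus the
// ordered value array that node-odbc's bind() expects.
function toPositional(sql, params) {
  const values = [];
  const text = sql.replace(/:([A-Za-z_][A-Za-z0-9_]*)/g, (_, name) => {
    if (!(name in params)) throw new Error(`Missing bind parameter: ${name}`);
    values.push(params[name]);
    return '?';
  });
  return { text, values };
}

const { text, values } = toPositional(
  'INSERT INTO MY_TABLE VALUES(:id, :name)',
  { id: 1, name: 'Name' }
);
console.log(text);   // INSERT INTO MY_TABLE VALUES(?, ?)
console.log(values); // [ 1, 'Name' ]
// then: await statement.prepare(text); await statement.bind(values);
```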

Add Prebuilt binaries

It would be awesome if we had pre-built binaries instead of building from source each time.

callProcedure crashes with Sybase ASE

OS: Ubuntu 18.04.3 LTS
unixODBC: 2.3.4
node-odbc: 2.2.2
Node: 10.17.0
Sybase ASE: 16.0 SP03 (Express Edition is available to download for free, so you can test)

Test program:

const odbc = require('odbc');

async function main() {
  try {
    const connection = await odbc.connect(
      'Driver=/opt/sap/DataAccess64/ODBC/lib/libsybdrvodb-sqllen8.so;' +
      'Server=ubuntu-test;' +
      'Port=5000;' +
      'Database=pubs2;' +
      'CharSet=ServerDefault;' +
      'UserID=testuser;' +
      'Password=testuser');

    const results = await connection.callProcedure(null, null, 'byroyalty', [50]);

    for (const result of results) {
      console.log(result);
    }
    
    connection.close();
  } catch (error) {
    console.error(error);
  }
}

main();

Produces a segfault:

Segmentation fault (core dumped)

Stacktrace:

#0  0x00007f9accef3639 in QueryData::clear (this=<optimized out>) at ../src/odbc.h:163
#1  QueryData::~QueryData (this=<optimized out>, __in_chrg=<optimized out>) at ../src/odbc.h:132
#2  CallProcedureAsyncWorker::~CallProcedureAsyncWorker (this=0x3743e00, __in_chrg=<optimized out>)
    at ../src/odbc_connection.cpp:1165
#3  CallProcedureAsyncWorker::~CallProcedureAsyncWorker (this=0x3743e00, __in_chrg=<optimized out>)
    at ../src/odbc_connection.cpp:1166
#4  0x00007f9accee8ee5 in Napi::AsyncWorker::Destroy (this=0x3743e00)
    at /home/dezsiszabi/sybase-node-test/node_modules/node-addon-api/napi-inl.h:3633
#5  Napi::AsyncWorker::OnWorkComplete (status=<optimized out>, this_pointer=0x3743e00)
    at /home/dezsiszabi/sybase-node-test/node_modules/node-addon-api/napi-inl.h:3741
#6  0x00000000009061cc in (anonymous namespace)::uvimpl::Work::AfterThreadPoolWork(int) ()
#7  0x0000000000a6b125 in uv__work_done (handle=0x264d790 <default_loop_struct+176>)
    at ../deps/uv/src/threadpool.c:313
#8  0x0000000000a6f4cf in uv__async_io (loop=0x264d6e0 <default_loop_struct>, w=<optimized out>, 
    events=<optimized out>) at ../deps/uv/src/unix/async.c:118
#9  0x0000000000a80d78 in uv__io_poll (loop=loop@entry=0x264d6e0 <default_loop_struct>, timeout=-1)
    at ../deps/uv/src/unix/linux-core.c:379
#10 0x0000000000a6fe0b in uv_run (loop=0x264d6e0 <default_loop_struct>, mode=UV_RUN_DEFAULT)
    at ../deps/uv/src/unix/core.c:364
#11 0x0000000000904595 in node::Start(v8::Isolate*, node::IsolateData*, std::vector<std::string, std::allocator<std::string> > const&, std::vector<std::string, std::allocator<std::string> > const&) ()
#12 0x000000000090281f in node::Start(int, char**) ()
#13 0x00007f9acf92db97 in __libc_start_main (main=0x8bbd20 <main>, argc=2, argv=0x7ffcfb606428, 
    init=<optimized out>, fini=<optimized out>, rtld_fini=<optimized out>, stack_end=0x7ffcfb606418)
    at ../csu/libc-start.c:310
#14 0x00000000008bbe55 in _start ()

The offending line is in odbc.h, line 163:

if (parameter = this->parameters[i], parameter->ParameterValuePtr != NULL) {

When adding some print statements in odbc.h and recompiling the code it seems that numParameters is 2 and this->parameters length is only 1, so the code will try to access an element outside the array.

For reference, the byroyalty stored procedure that the code is trying to run is the following (it's one of the sample stored procedures that was installed during Sybase installation):

create or replace procedure byroyalty @percentage int
as
select au_id from titleauthor
where titleauthor.royaltyper = @percentage

The titleauthor table has the following schema:

CREATE TABLE "dbo"."titleauthor"
(
  au_id varchar(11) NOT NULL,
  title_id varchar(6) NOT NULL,
  au_ord tinyint,
  royaltyper int
)

I think you'll probably have to set up a Sybase ASE Express Edition to be able to reproduce this. (Unless you see something obviously wrong with the code I gave above,)

I'll try to debug further and understand the library code a bit more.

Critical: 2.0 version data read: Float and Date values from same queries are not showing up same as Version 1.4.6

I have two instances running on same machine, pointing to same DB and SAME code-base - ONLY exception is one uses ODBC 1.4.6 and other uses ODBC 2.0.0.

I am doing this to test the new version. Performance is consistently 50% better, even when I tested with 10 MB of data returned through queries.

Issue: Main issue is FLOAT values returned by ODBC 2.0.0 driver are not correct. All other values match including dates/strings/numbers on same record..

Below is example:

Float samples
ODBC 1.4.6: [507.365, 504.167]
ODBC 2.0.0: [1.83187975902498e-76, 4.2778154908686205e-33]

Date samples: (the issue with dates is that the HOURS part always shows in LOCAL/EST with the 2.0 version)
ODBC 1.4.6: [2016-02-22T05:29:41.000Z]
ODBC 2.0.0: [2016-02-22 00:29:41.0000000] - Please see how ONLY hours part is wrong

ODBC 1.4.6: "2017-02-06T19:29:51.000Z"
ODBC 2.0.0: "2017-02-06 14:29:51.0000000"

NOTE: As you can see, the difference here is that HOURS is -5h off from the OLDER version, suggesting that the new 2.0.0 version is always converting values to the LOCAL/EST timezone.

Above are sample from one or two records but this is the case throughout.

Environment: Node 8.16.0 [for both on same machine and pointing to same DB/queries etc]

This is critical issue as data needs to be 100% accurate with new version. Really appreciate any help on this.

Losing precision of decimal types

If I execute a query where I cast a value to a decimal type with precision specified, like CAST something AS DECIMAL(18,2), it loses the specified precision and just casts it to DECIMAL (with no precision specified).

odbcErrors 30126, 30207 on Select when nothing found

Depending on the parameters passed: if there is data returned from the database the query succeeds, but if we pass parameters for which there is no data it errors with odbcError codes 30126, 30207. The error message passed back is very vague and generic: "Error executing the statement".

Example Query:
SELECT *
FROM tableName
WHERE PART = ?;

If the part is found it returns data successfully and there are no issues. If the part is not found, instead of returning an empty array it throws the error above. I am running version 2.2.2. I can't even find any information about these error codes and what they mean.

Opaque Error Messages

Hello. The error messages from ODBC are quite uninformative. SQL syntax errors are very vague and don't describe where the error occurred or hint to what may be the problem.

Example error

Error: [odbc] Error executing the sql statement

How can we access more information about the errors?

select with character 'ñ'

Hello, I have updated odbc from 1.4.5 to 2.1.3 and I have a problem doing a select on a column with the character 'ñ'. I am using a connection pool now; the same select in version 1.4.5 without a pool works fine, but with 2.1.3 and a pool I get the following error:
select * from "Table" where "año"='${anho}' and "fecha"='${fecha}'

'[Pervasive][ODBC Client Interface][LNA][Pervasive][ODBC Engine Interface]Invalid column name: 'año'.' },
{ state: '42S22',
code: 0,

Is there some option to set an encoding or something like this?

Thank you.

SQL_TINYINT Suppport

No switch statement exists for SQL_TINYINT when binding data either into or out of the database. This means that fields that map to SQL_TINYINT are returned as strings, which isn't the desired behavior. Add SQL_TINYINT support to the package.

Pool maxSize documented but not implemented

First of all - great package :). This has worked nicely to connect to an iSeries database.

The maxSize property of the pool configuration is documented but does not appear to be implemented. It appears that new connections will be opened with no limit. We would love to have the maxSize property implemented, where query would wait until a free connection is available. Short of that, it would probably be good to indicate in the documentation that maxSize is not implemented.
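Until maxSize is honored, the cap can be approximated in userland with a small counting semaphore wrapped around whatever issues the queries - a sketch only, independent of node-odbc's internals (`createLimiter` and the usage names are made up):

```javascript
// Minimal counting semaphore: at most `max` callers run `fn` at once;
// the rest queue until a running task hands its slot over.
function createLimiter(max) {
  let active = 0;
  const waiters = [];
  return async function limit(fn) {
    if (active >= max) {
      // wait for a running task to hand its slot to us
      await new Promise(resolve => waiters.push(resolve));
    } else {
      active += 1;
    }
    try {
      return await fn();
    } finally {
      const next = waiters.shift();
      if (next) next();  // pass the slot straight to a waiter
      else active -= 1;
    }
  };
}

// Usage: const limit = createLimiter(10);
// const rows = await limit(() => pool.query('SELECT ...'));
```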

memory problem with DATETIME

Hello! Thank you a lot for this module, but there's a little problem that I can't resolve.
Electron + node 8.9.3 + odbc 1.4.5 (the newest version doesn't work with Electron because of the node version restriction). I successfully connected to an Interbase 2009 database, but there's a problem: when I try to get a datetime field, the application just consumes about 2 GB of memory and crashes. Just a SELECT, nothing else.
No problem with other types - strings and numbers come back in the result just fine, but it looks like there's a problem with datetime. Not sure if everything is OK with the code, but here is how it goes now:

let odbc = window.require('odbc')()
let constr = 'DRIVER=Firebird/InterBase(r) driver; UID=SYSDBA; PWD=masterkey; DBNAME=localhost:c:\\Firebird\\database.ibs;charset=win1251'
let connection = odbc.openSync(constr);
let query = "select S_DURATION from SPR where S_KEY = 464"
let stmt = odbc.querySync(query)

I have no idea why this happens. Is there any solution or suggestion for how to fix it?

Thanks a lot for your time.

UPD: solved it by casting datatypes, yet the problem with timestamp stays the same.

SQL Data Types

Hello,

I tested the same query (column discontinued has data type BIT) on both versions 1.4.5 and 2.0.0, and it seems v1 can recognize BIT but v2 cannot; are there any plans to support it in the future?

1.4.5

[ { 'product-code': '171ZFS27', discontinued: false },
  { 'product-code': '75131-TRI', discontinued: false },
  { 'product-code': 'X18096-GALV', discontinued: true },
  { 'product-code': 'DS42152', discontinued: false },
  { 'product-code': 'AA76520', discontinued: false },
  { 'product-code': 'X16420-GREE', discontinued: true },
  { 'product-code': 'X16420-WHIT', discontinued: false },
  { 'product-code': '37404-WHIT', discontinued: false },
  { 'product-code': 'D76480', discontinued: false },
  { 'product-code': '173HFX04', discontinued: true } ]

2.0.0-4

[ { 'product-code': '171ZFS27', discontinued: '0' },
  { 'product-code': '75131-TRI', discontinued: '0' },
  { 'product-code': 'X18096-GALV', discontinued: '1' },
  { 'product-code': 'DS42152', discontinued: '0' },
  { 'product-code': 'AA76520', discontinued: '0' },
  { 'product-code': 'X16420-GREE', discontinued: '1' },
  { 'product-code': 'X16420-WHIT', discontinued: '0' },
  { 'product-code': '37404-WHIT', discontinued: '0' },
  { 'product-code': 'D76480', discontinued: '0' },
  { 'product-code': '173HFX04', discontinued: '1' },
  statement: 'SELECT TOP 10 "product-code", "discontinued" FROM pub.product',
  parameters: [],
  return: undefined,
  count: -1,
  columns: [ { name: 'product-code', dataType: 12 },
    { name: 'discontinued', dataType: -7 } ] ]

node-gyp compile failure during install

Receiving the error below while attempting to install on IBM i. Also tried using node v10 instead of node v12 - similar error with a dependency on odbc.o.d.raw. What am I missing? unixODBC and unixODBC-devel are installed (2.3.4-3), the IBM i Access ODBC driver is installed, and 'Development tools' are installed.

gmake: Entering directory '/seneca/odbctest/node_modules/odbc/build'
CXX(target) Release/obj.target/odbc/src/odbc.o
In file included from /home/LUANN/.cache/node-gyp/12.13.0/include/node/node_api.h:10:0,
from /seneca/odbctest/node_modules/node-addon-api/napi.h:4,
from ../src/odbc.cpp:19:
/home/LUANN/.cache/node-gyp/12.13.0/include/node/js_native_api.h:39:22: fatal error: opening dependency file ./Release/.deps/Release/obj.target/odbc/src/odbc.o.d.raw: No such file or directory
#define EXTERN_C_END }

We had successfully installed on another server for dev/test and are now trying to install on the client's prod server so the app can be deployed there. I can't figure out what is different.

Process crashes while connecting to Database on Node v11.15.0

Hi,
Everything worked fine on Windows with Node version v10.14.2.
I did a fresh install of Node v11.15.0 on a Linux box and tested the project.

When I try to open a DB connection, the process exits with the failure below:
error Command failed with signal "SIGSEGV".

Is it possible this is specific to the Node version I am using?
Has anyone tested 2.0 with this exact version, or seen similar issues?

Calling Stored Procedures Issue

I keep getting "Stored procedure doesn't exit" in DB2.

Create or Replace Function Test() Returns VARCHAR(4000)
Return ('Test');

const connection = await odbc.connect("DSN=MYDSN");
const result = await connection.callProcedure(null, null, 'TEST', [undefined]);
console.log(result);

(node:2235) UnhandledPromiseRejectionWarning: Error: Error: [Node.js::odbc] CallProcedureAsyncWorker::Execute: Stored procedure TEST doesn't exit
at Object. (.../odbc/lib/Connection.js:114:20)

Working with DBF file

I have a DBF file that contains Thai characters encoded with TIS-620. I am currently able to read the file, but the Thai characters are not decoded correctly (I get � instead). Can anyone please guide me on how to fix this? Thank you!

New feature request: streaming results

I would propose a way to 'stream' the query result without loading all results at once.
It's essential for queries that return many, many records...

In the previous version of odbc, I used the db.queryResult function, which returned a delegate with a .fetch method...
Something similar to this would be perfect.

Thanks,
Alessio

connection.Close causes exit process with code: 3221225477.

I am testing latest 2.0.0 version - without connection Pool. Just connect and execute query.

My scenario:

  1. Two DB connections execute two queries at the same time and Promise.all waits for both
  2. Both queries complete and I am seeing results read from both.
  3. First DB Connection close works without issues
  4. Second DB connection.close command breaks the process - terminating and returning code: 3221225477

FYI; I commented out connection.close and any number of times I am able to run queries.. (This is without using any Connection Pool).

But my concern is if I don't call Close, too many connection will be left open and soon that will be problem.

Below is sample code I am using.. please let me know if you see any issues:

Current code uses the callback approach - so left it that way

odbc.connect(connString, function(err, conn) {
  conn.query(queryText, [], function(err, data) {
    // Check errors etc.

    // Close connection
    conn.close((err) => {
      if (err) {
        console.log('executeODBCQuery: Connection closed. (ERROR)');
      } else {
        console.log('executeODBCQuery: Connection closed');
      }
    });
  });
});

Error: not found: gmake

Getting gmake not found on ibm when running "npm install odbc".

gyp ERR! build error
gyp ERR! stack Error: not found: gmake
gyp ERR! stack at getNotFoundError (/QOpenSys/pkgs/lib/nodejs10/lib/node_modules/npm/node_modules/which/which.js:13:12)
gyp ERR! stack at F (/QOpenSys/pkgs/lib/nodejs10/lib/node_modules/npm/node_modules/which/which.js:68:19)
gyp ERR! stack at E (/QOpenSys/pkgs/lib/nodejs10/lib/node_modules/npm/node_modules/which/which.js:80:29)
gyp ERR! stack at /QOpenSys/pkgs/lib/nodejs10/lib/node_modules/npm/node_modules/which/which.js:89:16
gyp ERR! stack at /QOpenSys/pkgs/lib/nodejs10/lib/node_modules/npm/node_modules/isexe/index.js:42:5
gyp ERR! stack at /QOpenSys/pkgs/lib/nodejs10/lib/node_modules/npm/node_modules/isexe/mode.js:8:5
gyp ERR! stack at FSReqWrap.oncomplete (fs.js:153:21)

Database results failing to parse correctly

It looks like I am able to successfully execute queries, however, the results are coming back really funky. I have toyed a lot with the SQL statement, and have rewritten the entire query a few times using nested callbacks, and connection.query(...) syntax as well but it doesn't seem to change the results any.

'use strict'

const { series } = require('gulp')
const db = require('odbc')
const asyncForEach = require('../../../src/lib/asyncForEach')

const synchronize = async done => {
  try {
    console.log('connecting to ODBC driver')

    const conn = await db.connect(`DSN=TS_CAE_STD;uid=${process.env.USER};pwd=${process.env.PASS};`)
    console.log('connected ... creating a statement')

    const stmt = await conn.createStatement()
    console.log('statement created ... preparing statement')

    const q = "SELECT TOP 5 x.job, x.cost_code, x.transaction_date, x.accounting_date, x.period_end_date, x.description, x.units, x.employee, x.pay_id, x.date_stamp, x.time_stamp FROM JCT_CURRENT__TRANSACTION AS x WHERE x.job <> '' AND x.cost_code <> '' AND x.units <> 0 AND x.employee <> '' AND x.pay_id <> '' ORDER BY x.date_stamp ASC, x.time_stamp ASC;"
    await stmt.prepare(q)
    console.log('statement prepared ... executing statement')

    const rows = await stmt.execute()
    console.log('statement executed ... looping results')

    await asyncForEach(rows, row => {
      console.log(row)
    })
    console.log('results looped ... closing connection')

    await conn.close()
    console.log('connection closed ... ending task')
    done()
  } catch (err) {
    done(err)
  }
}

module.exports = series(synchronize)

Console Output Results:

[16:48:29] Using gulpfile ~\Documents\dev\sage-scraper\gulpfile.js
[16:48:29] Starting 'synchronizations:equalityFund:transactions'...
[16:48:29] Starting 'synchronize'...
connecting to ODBC driver
connected ... creating a statement
statement created ... preparing statement
statement prepared ... executing statement
statement executed ... looping results
{ '': '07:46:54', '�\u0007': '14691' }
{ '': '07:46:54', '�\u0007': '19983' }
{ '': '07:46:56', '�\u0007': '10644' }
{ '': '07:46:56', '�\u0007': '7587' }
{ '': '07:46:56', '�\u0007': '14691' }
results looped ... closing connection
connection closed ... ending task
[16:48:58] Finished 'synchronize' after 29 s
[16:48:58] Finished 'synchronizations:equalityFund:transactions' after 29 s
Done in 29.75s.

Below is a screenshot of the same SQL query using Microsoft Access as a query tool (database is not Access) and the same DSN
example1
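The keys in the rows printed above contain control characters (for example '\u0007') and replacement characters, which points to a column-name encoding or binding problem in the driver rather than bad table data. As a first diagnostic step, the sketch below scans each row's keys for non-printable characters; hasGarbledKeys is a hypothetical helper, not part of node-odbc.

```javascript
// Hypothetical diagnostic helper: flags object keys containing
// control or replacement characters, which usually indicate an
// encoding/column-binding problem rather than bad data.
function hasGarbledKeys(row) {
  // \u0000-\u001F and \u007F-\u009F are control ranges; \uFFFD is
  // the Unicode replacement character seen in the output above.
  const control = /[\u0000-\u001F\u007F-\u009F\uFFFD]/;
  return Object.keys(row).some((key) => control.test(key));
}

// Example with a row shaped like the garbled output above:
const sample = { '': '07:46:54', '\u0007': '14691' };
console.log(hasGarbledKeys(sample)); // true
```

If this flags your rows, comparing the driver's reported column metadata (node-odbc result arrays expose a columns property) against the expected column names can help narrow down whether the corruption happens at describe time or at fetch time.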

Error binding parameters on stored procedure call

Hello! I am running into an error and am having trouble getting enough information to troubleshoot it...
Error: [odbc] Error binding parameters to the procedure

I am running this on an IBMi with the following driver...

[IBM i Access ODBC Driver]                   
Description=IBM i Access ODBC Driver         
Driver=/QOpenSys/pkgs/lib/libcwbodbc.so      
Threading=0                                  
DontDLClose=1                                
UsageCount=1

My javascript code looks like this...
let result = await connection.callProcedure(null, null, 'GPR5016SP', ['RE', 'Y', '', 'GPACMID', '', '', '', 0, 0]);

Our system is automatically building the library list for me, so I get the same error whether I qualify the library or not...
let result = await connection.callProcedure(null, 'DUNC', 'GPR5016SP', ['RE', 'Y', '', 'GPACMID', '', '', '', 0, 0]);

This is the definition of the stored procedure...

	INOUT GACT CHAR(2) , 
	IN IGNO CHAR(1) , 
	IN CUSR CHAR(10) , 
	IN NAME CHAR(10) , 
	IN LIBRARY CHAR(10) , 
	IN "TYPE" CHAR(10) , 
	IN CHARVALUE VARCHAR(2000) , 
	IN DECIMALVALUE DECIMAL(31, 9) , 
	IN INCREMENTVALUE DECIMAL(31, 9) ) 
	DYNAMIC RESULT SETS 1 
	LANGUAGE RPGLE 
	SPECIFIC DUNC/GPR5016SP 
	NOT DETERMINISTIC 
	MODIFIES SQL DATA 
	CALLED ON NULL INPUT 
	EXTERNAL NAME 'DUNC/GPR5016P' 
	PARAMETER STYLE GENERAL ; 
  
COMMENT ON SPECIFIC PROCEDURE DUNC/GPR5016SP 
	IS 'Dynamic Data Area Access' ; 
  
LABEL ON SPECIFIC PROCEDURE DUNC/GPR5016SP 
	IS 'Dynamic Data Area Access' ; 
  
GRANT EXECUTE   
ON SPECIFIC PROCEDURE DUNC/GPR5016SP 
TO PUBLIC ; 
  
GRANT ALTER , EXECUTE   
ON SPECIFIC PROCEDURE DUNC/GPR5016SP 
TO QPGMR WITH GRANT OPTION ;

I am able to run simple SQL statements. I've tried insert, update, delete, and select, but I have not tried SQL with functions yet. So far SQL works fine.

I can also run this same SQL and stored procedure call when running node on my PC. The driver I'm using from my pc is...
iSeries Access ODBC Driver

Our system admin does not see any attempt to make a call when looking through job logs. So, it appears that something within the driver or the odbc code is throwing this error.

What can we do to get more information and narrow down a cause? Is there any specific configuration we should look at? Am I making the javascript call incorrectly?
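One way to get more detail is to inspect the odbcErrors array that node-odbc attaches to the errors it throws; each entry carries the driver's SQLSTATE, native error code, and message, which is usually more specific than the generic "Error binding parameters" text. The helper below is a hedged sketch (formatOdbcError is not part of node-odbc) showing how those fields could be surfaced:

```javascript
// Hypothetical helper: node-odbc errors may carry an `odbcErrors`
// array with { state, code, message } entries from the driver.
// Printing them often reveals the SQLSTATE behind a binding error.
function formatOdbcError(err) {
  if (!Array.isArray(err.odbcErrors)) {
    return err.message;
  }
  return err.odbcErrors
    .map((e) => `[${e.state}] (${e.code}) ${e.message}`)
    .join('\n');
}

// Usage around the failing call:
// try {
//   await connection.callProcedure(null, 'DUNC', 'GPR5016SP', params);
// } catch (err) {
//   console.error(formatOdbcError(err));
// }
```

A joblog entry may only appear once the call actually reaches the server; if the bind fails client-side (for example, while the driver looks up parameter metadata via the catalog), the SQLSTATE from the error above is the place to look.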

Thank you for your help!

Install odbc error

Hello,
I am trying to install the package, but npm i odbc fails with the following message:
Error_ODBC

I am on Windows 10, VS Code 1.35.0, Python 2.7, Node v10.15.3
Thanks in advance for your time,
Best regards.
