
dbffile's People

Contributors

acdibble, dependabot[bot], diegonc, khaos66, kinolaev, lordrip, meilechwieder, merik-chen, paypacadam, troyvgw, wasenshi123, workingacry, wseng, yortus


dbffile's Issues

Cannot update a specific field in a row.

Hi mate, I'm having some trouble updating a specific field in a row. Please check this example and help me.

The file has one row, and I only want to update these two specific fields in the first row. I'll be waiting for your comments.

const DBFFile = require('dbffile');

var rows = [{
    ULTSOCIO: '3223',
    ULTAHORROS: '8889'
}];

DBFFile.open('NUMETRAN.DBF')
    .then(dbf => {
        return dbf.set(rows);
    })
    .catch(err => console.log('An error occurred: ' + err));

Error on read large file

Hi, this error happens when I try to read a large file (36k records):

{ [Error: EINVAL: invalid argument, read] errno: -4071, code: 'EINVAL', syscall: 'read' }

I can't find anything on Google :(

Thai Character Support

Hi, greetings from Thailand :)

Can you please guide me on how to make this work with Thai characters?
Everything else works perfectly.
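
A sketch of one thing to try, assuming the file was written with the Thai codepage (TIS-620 / Windows-874): pass that encoding when opening, since the encoding option is handed on to iconv-lite, which accepts spellings such as 'tis620' and 'cp874'.

    const { DBFFile } = require('dbffile');

    async function readThaiDbf(path) {
        // Assumption: the file's codepage is Thai; adjust the name if it is not.
        const dbf = await DBFFile.open(path, { encoding: 'cp874' });
        return dbf.readRecords(dbf.recordCount);
    }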

Example doesn't work

var DBFFile = require('dbffile');

DBFFile.open('[full path to .dbf file]')
.then(function (dbf) {
  console.log('DBF file contains ' + dbf.recordCount + ' rows.');
  console.log('Field names: ' + dbf.fields.map(function (f) { return f.name; }).join(', '));
  return dbf.readRecords(100);
})
.then(function (rows) {
  rows.forEach(function (row) {
    console.log(row);
  });
})
.catch(function (err) {
  console.log('An error occurred: ' + err);
});

First callback never called.

open client side

Hi, how can I open a DBF file on the client side, without uploading it?

Problem with Node.js version 11

@yortus
When I try to add this to my project, it throws an error:

import { DBFFile } from 'dbffile';

import { DBFFile } from 'dbffile';
^

SyntaxError: Unexpected token {
at new Script (vm.js:85:7)
at createScript (vm.js:266:10)
at Object.runInThisContext (vm.js:314:10)
at Module._compile (internal/modules/cjs/loader.js:698:28)
at Object.Module._extensions..js (internal/modules/cjs/loader.js:749:10)
at Module.load (internal/modules/cjs/loader.js:630:32)
at tryModuleLoad (internal/modules/cjs/loader.js:570:12)
at Function.Module._load (internal/modules/cjs/loader.js:562:3)
at Module.require (internal/modules/cjs/loader.js:667:17)
at require (internal/modules/cjs/helpers.js:20:18)
[nodemon] app crashed - waiting for file changes before starting...
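
For reference, the published package is CommonJS, so in a plain Node.js script (without a transpiler or "type": "module") the ES import line can be swapped for require. A minimal sketch:

    // CommonJS equivalent of `import { DBFFile } from 'dbffile'`.
    const { DBFFile } = require('dbffile');

    DBFFile.open('[full path to .dbf file]')
        .then(dbf => console.log('DBF file contains ' + dbf.recordCount + ' rows.'))
        .catch(err => console.log('An error occurred: ' + err));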

Unsupported version

Hi.

I'm really interested in your parser, because it allows each field's length to be defined individually. I need to append new records to an existing database. However, the parser gives me the following message: "AssertionError: File 'FILENAME.DBF' has unknown/unsupported dBase version: -11.".

Is it possible to make it work?

Thank you very much.

Issue using with node project.

I get the following error when trying to use the module

`App threw an error during load
C:\Users\desktop\testapp\main.js:12
import {DBFFile} from 'dbffile';
^^^^^^

SyntaxError: Cannot use import statement outside a module`

I am a beginner so I am sure this is user error. What is the proper way to import this project?
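
The error means Node is parsing main.js as CommonJS, where import syntax is not allowed. Two common fixes (both assumptions about the project setup): add "type": "module" to package.json so ES imports work, or keep CommonJS and use require. A minimal sketch of the require form:

    // main.js, CommonJS form (no package.json changes needed).
    const { DBFFile } = require('dbffile');

    async function main() {
        // Hypothetical path; substitute the real .dbf file.
        const dbf = await DBFFile.open('C:/path/to/data.dbf');
        console.log('DBF file contains ' + dbf.recordCount + ' rows.');
    }

    main().catch(err => console.log('An error occurred: ' + err));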

Can I update a specific row?

I want to be able to update a specific row.

If updating is for some reason impossible, how about this: every time we update a row, we just replace the whole file's content with the new, updated content? We have the descriptors and everything in memory; we just delete the old file and create a new one with all the existing records plus the updated ones. Can this feature be built into the library? (A sketch of this rewrite approach follows below.)

Also, what is the file size limit this library can handle? Does it just depend on my device's resources?
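
A minimal sketch of that rewrite-the-whole-file approach using the current API (the function name and the idea of writing to a second path before swapping files are assumptions, not library features):

    const { DBFFile } = require('dbffile');

    // Read every record, apply an update function, and write a fresh file with
    // the same field descriptors. The caller can then replace the original file.
    async function rewriteWithUpdate(srcPath, destPath, updateRow) {
        const src = await DBFFile.open(srcPath);
        const rows = await src.readRecords(src.recordCount); // whole file in memory
        const dest = await DBFFile.create(destPath, src.fields);
        await dest.appendRecords(rows.map(updateRow));
    }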

I have an error when I try the example

index.ts:14:64 - error TS2345: Argument of type '{ name: string; type: string; size: number; }[]' is not assignable to parameter of type 'FieldDescriptor[]'.
Type '{ name: string; type: string; size: number; }' is not assignable to type 'FieldDescriptor'.
Types of property 'type' are incompatible.
Type 'string' is not assignable to type '"C" | "N" | "F" | "L" | "D" | "I" | "M" | "T" | "B"'.

14 let dbf = await DBFFile.create('<full path to .dbf file>', fieldDescriptors);
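
The compiler error comes from literal-type widening: in an inline array, the type property widens to string, which doesn't match the union of one-character codes the library expects. A sketch of two ways around it (whether FieldDescriptor is exported is an assumption here, so the as const form is the safer bet; the field names are placeholders):

    // Option 1: pin the literal types per property.
    const fieldDescriptors = [
        { name: 'NAME', type: 'C' as const, size: 50 },
        { name: 'BIRTHDATE', type: 'D' as const, size: 8 },
    ];

    // Option 2, if the package exports the type:
    // const fieldDescriptors: import('dbffile').FieldDescriptor[] = [ /* ... */ ];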

readRecords will occasionally throw a RangeError

I'm using this library to convert some old DBF data, and I'm running into an issue where readRecords(n) will throw RangeErrors. Currently I've just wrapped readRecords in a try/catch and only read one record at a time, which works, but naturally some data is lost.

Full stack trace:

RangeError [ERR_BUFFER_OUT_OF_BOUNDS]: Attempt to access memory outside buffer bounds
    at boundsError (internal/buffer.js:80:11)
    at Buffer.readInt32LE (internal/buffer.js:386:5)
    at int32At (/home/kurtis/Documents/anthology-converter/node_modules/dbffile/dist/dbf-file.js:264:72)
    at readRecordsFromDBF (/home/kurtis/Documents/anthology-converter/node_modules/dbffile/dist/dbf-file.js:345:35)
    at async Object.module.exports.convert (/home/kurtis/Documents/anthology-converter/src/converters/custconverter.js:31:17) {
  code: 'ERR_BUFFER_OUT_OF_BOUNDS'
}

Unfortunately I can't attach the files in question as they contain confidential/private data. However, I can share some metadata about the file:

  • Version: 48
  • # records: 23k
  • The file contains the Y (money) column type, which is currently unsupported

Thanks in advance for the help.

Support stream?

Does the library support streams? E.g.:

const readStream = fs.createReadStream('path to dbf file');
readStream.pipe(/* process a row in the dbf file */);
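
The library doesn't appear to expose a Node stream interface, but newer releases support async iteration, which gives similar record-at-a-time processing without loading the whole file. A minimal sketch, assuming that async-iterator support:

    const { DBFFile } = require('dbffile');

    async function processRows(path) {
        const dbf = await DBFFile.open(path);
        // Iterate record by record rather than reading everything at once.
        for await (const record of dbf) {
            console.log(record);
        }
    }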

When I append a record containing Chinese characters I get an error saying TypeError: "value" argument is out of bounds

let fieldDescriptors = [
        {
            name: 'C_ID',
            type: 'C',
            size: 16,
        },
        {
            name: 'C_HM',
            type: 'C',
            size: 64,
        }, {
            name: 'C_DZ',
            type: 'C',
            size: 64,
        },
        {
            name: 'N_Y',
            type: 'N',
            size: 4,
        },
        {
            name: 'N_M',
            type: 'N',
            size: 4,
        },
        {
            name: 'D_CB',
            type: 'D',
            size: 8,
        },
        {
            name: 'N_BCCM',
            type: 'N',
            size: 8,
        },
        {
            name: 'N_SCCM',
            type: 'N',
            size: 8,
        },
        {
            name: 'I_CBBZ',
            type: 'N',
            size: 1,
        }
    ]
    let record = {
        C_ID: '',
        C_HM: item.customerNumber,
        C_DZ: String("测试"),
        N_Y: Number(dateformat(new Date(item.logDateTime), 'yyyy')),
        N_M: Number(dateformat(new Date(item.logDateTime), 'mm')),
        D_CB: new Date(item.logDateTime),
        N_BCCM: Number((item.number/1000).toFixed()) || 0,
        N_SCCM: 0,
        I_CBBZ: 1
    };
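
One thing worth trying (an assumption, not a confirmed fix for the "value" argument is out of bounds error): pass an encoding that covers Chinese text when creating and appending, since the default single-byte encoding cannot represent 测试. The sketch below assumes create accepts the same encoding option as open, and that the encoding name is forwarded to iconv-lite (which supports e.g. 'gbk').

    const { DBFFile } = require('dbffile');

    async function writeWithChinese(path, fieldDescriptors, records) {
        // Assumption: 'gbk' matches what the consuming application expects.
        const dbf = await DBFFile.create(path, fieldDescriptors, { encoding: 'gbk' });
        await dbf.appendRecords(records);
    }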

Add support to edit a record

Hi @yortus, I have a use case in which I need to edit a record; do you think this could be useful for the library?

If so, I'm thinking that maybe we could return an array of some sort of record objects, each holding its own offset. That way we can edit an object and then just call .update() on it.
For instance:

const dbf = DBFHandlerFile('dbf-path'); // please note that this is a different class
const records = dbf.getAll(); // this could return an array of DBFRecordHandler

console.log(records[0]); // prints { field1: 10, field2: 'text-field' }
records[0].update('field1', 20); // this could throw an exception if the field doesn't exist
records[0].save(); // save the record

What do you think? (Sorry if this looks like a lot of work.)

Add support for Visual FoxPro tables & DateTime fields

Hi! Thanks for this nice library. I wanted to ask whether it is possible to add support for Visual FoxPro tables and DateTime fields. I'm currently migrating some local VFP data to a web server. I'm working on it; can I make a PR for your consideration when it's done?

Thanks in advance!

FoxPro9 Memo fields are 4 bytes in size

When trying to open a VFP9 DBF which contains at least one field of type Memo, this lib throws the error:
Invalid field size (must be 10)

The size 10 is hard-coded inside field-descriptor.ts

if (type === 'M' && size !== 10) throw new Error('Invalid field size (must be 10)');

I'm not sure whether this differs from other dBase DBFs, but as far as I know VFP9 always uses a size of 4 bytes for Memo fields.

If this is different, maybe there could be a dialect option?
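
A hedged sketch of the kind of relaxation being suggested (not the library's actual fix): accept a 4-byte memo descriptor alongside the classic 10-byte one, ideally keyed off a file-version or dialect option.

    // Sketch only: validate a memo field's size, allowing the 4-byte
    // descriptors written by VFP9 as well as the 10-byte dBase ones.
    function validateMemoFieldSize(size, isVisualFoxPro) {
        const allowed = isVisualFoxPro ? [4, 10] : [10];
        if (!allowed.includes(size)) {
            throw new Error(`Invalid memo field size: ${size}`);
        }
    }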

Long text fields breaking code

I had an issue with a super-old (1990s) DBF file. The developer had created a 256-character text field. I didn't notice the real issue at first since I was using loose read mode.

I've just changed a few lines to work around it. It's a quick-and-dirty solution, but it works for me. I'll post a pull request in case anyone else needs it.

Numeric with decimalPlaces

When the amount has decimal places, for example 175,50, the created DBF file is empty; if the amount is 175.00, the file is not empty.

Tried:
{ name: 'amount', type: 'N', size: 19 },
{ name: 'amount', type: 'N', size: 19, decimalPlaces: 2 },
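
A minimal sketch of writing a decimal amount, assuming the root cause is the comma decimal separator (175,50 is not a valid JavaScript number; the field value should be a number such as 175.5):

    const { DBFFile } = require('dbffile');

    async function writeAmounts(path) {
        const fieldDescriptors = [
            { name: 'AMOUNT', type: 'N', size: 19, decimalPlaces: 2 },
        ];
        const dbf = await DBFFile.create(path, fieldDescriptors);
        // Pass a plain number (175.5), not the string '175,50'.
        await dbf.appendRecords([{ AMOUNT: 175.5 }]);
    }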

Error: Cannot create a string longer than 0x1fffffe8 characters

I read a bunch of DBF files successfully using your library (thanks for that!). However, I get an error on certain files:

(node:79490) UnhandledPromiseRejectionWarning: Error: Cannot create a string longer than 0x1fffffe8 characters
    at Object.slice (buffer.js:616:37)
    at Buffer.toString (buffer.js:804:14)
    at SBCSDecoder.write (/Users/MyUser/Code/ProjectName/node_modules/iconv-lite/encodings/sbcs-codec.js:68:19)
    at Object.decode (/Users/MyUser/Code/ProjectName/node_modules/iconv-lite/lib/index.js:42:23)
    at readRecordsFromDBF (/Users/MyUser/Code/ProjectName/node_modules/dbffile/dist/dbf-file.js:307:52)

Here is more info about the old file (from a 1991 MS-DOS application), obtained with the Python dbf package:

Table:         Test.DBF
Type:          dBase III Plus
Codepage:      ascii (plain ol' ascii)
Status:        DbfStatus.CLOSED
Last updated:  2020-05-23
Record count:  240
Field count:   18
Record length: 166
--Fields--
    0) nr N(4,0)
    1) klantnr N(4,0)
    2) faktype C(1)
    3) datum D
    4) tekst M
    5) netto N(7,0)
    6) btw N(7,0)
    7) totaal N(7,0)
    8) medecon N(7,0)
    9) netto1 N(7,0)
    10) netto2 N(7,0)
    11) netto3 N(7,0)
    12) netto4 N(7,0)
    13) netto5 N(7,0)
    14) uitvoer N(7,0)
    15) uitvoerneg N(7,0)
    16) betkode C(1)
    17) diversen C(60)

(I can send it if needed, 40kB).

I used this snippet of code to reproduce the error:

import { DBFFile } from "dbffile";

(async function test() {
  const filePath = "Test.DBF";
  const dbf = await DBFFile.open(filePath, { encoding: "ascii" });

  const records = await dbf.readRecords(10);
})();

I also tried no encoding, CP850, and some other options, but I'm not sure whether it's encoding-related. Providing no encoding at all (the default latin1) worked for the other files of the same database.

Memo file not found for file '${path}'

Hi all,

thanks for the work so far.

I have an issue regarding case-sensitivity of input files.
Unfortunately I have two casings of filenames:

table1.dbf
table1.dbt

TABLE2.DBF
TABLE2.DBT

But I can't copy or modify the client's data.

Currently I just patched the package:

if (fileVersion === 0x83 || fileVersion === 0x8b) {
    memoPath = path.slice(0, -path_1.extname(path).length) + '.dbt';
    let isMemoFileMissing = await utils_1.stat(memoPath).catch(() => 'missing') === 'missing';
    if (isMemoFileMissing) {
        memoPath = path.slice(0, -path_1.extname(path).length) + '.DBT';
        isMemoFileMissing = await utils_1.stat(memoPath).catch(() => 'missing') === 'missing';
    }
    if (isMemoFileMissing)
        memoPath = undefined;
}

Any better solutions?

Regards,
Acry
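
One possible alternative (a sketch, not current library behaviour): resolve the memo file by scanning the directory case-insensitively, so any casing of the name or the .dbt extension is found.

    const fs = require('fs/promises');
    const path = require('path');

    // Sketch: find a sibling memo file for `dbfPath`, ignoring the case of the
    // file name and extension (e.g. table1.dbt vs TABLE2.DBT).
    async function findMemoPath(dbfPath) {
        const dir = path.dirname(dbfPath);
        const base = path.basename(dbfPath, path.extname(dbfPath)).toLowerCase();
        const entries = await fs.readdir(dir);
        const match = entries.find(e => e.toLowerCase() === base + '.dbt');
        return match ? path.join(dir, match) : undefined;
    }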

Missing the first character of value

Hi,
I'm really surprised that I found a DBF reader for Node.js. :) Thanks!
But when I tested it I got wrong results for some fields.
The first character is missing. The DBF encoding is CP-852.
(I did not test creating or modifying records, because I only need to read from old databases.)

{ NEV: 'FD', TARTALOM: 'iák munkaváll. 110120' } - the wrong result
{ NEV: 'F', TARTALOM: 'Diák munkaváll. 110120' } - i think it should be like this

Maybe tabs or extra spaces in the field value cause the problem?
The console log results are in the attached screenshot (Képernyőkép 2022-05-31 152034).
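
Not necessarily related to the missing first character, but since the file is CP-852, the encoding can be passed explicitly when opening. A sketch, assuming iconv-lite's 'cp852' codec matches the file:

    const { DBFFile } = require('dbffile');

    async function readCp852(path) {
        // Open with the file's actual codepage instead of the default.
        const dbf = await DBFFile.open(path, { encoding: 'cp852' });
        return dbf.readRecords(dbf.recordCount);
    }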

Problems adding new records using dbf.appendRecords(records)

Hi guys, I have a problem inserting a new record into a DBF file; the error is:

Error: SALESDATE: expected a date

my code:
let records = [
    {
        CID: 'JN0LZGRL',
        CM: '01',
        SALESDATE: '15/03/23',
        CCOD_DOC: '07',
        CSER: '00B070',
        CNUM: '0000000000241',
    }
]

Help me, please.
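
The validation message suggests the 'D' (date) field expects a JavaScript Date instance rather than a formatted string (the validation code quoted in a later issue checks value instanceof Date). A minimal sketch, assuming '15/03/23' means 15 March 2023 and that dbf comes from DBFFile.open:

    async function appendSale(dbf) {
        const records = [{
            CID: 'JN0LZGRL',
            CM: '01',
            // 'D' fields are validated as Date instances, not strings.
            SALESDATE: new Date(2023, 2, 15), // month is 0-based: 2 = March
            CCOD_DOC: '07',
            CSER: '00B070',
            CNUM: '0000000000241',
        }];
        await dbf.appendRecords(records);
    }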

dbf-file.js appendRecordsToDBF sets value to empty space for DateTime field

I noticed the line of code below, which causes an issue for us when trying to insert a row into a DBF with an empty datetime field. utils_2.formatVfpDateTime throws an exception saying getTime is not a function, which is correct when the datetime value has been artificially set to an empty string.

Could you consider making a code change to skip the datetime field when the value is null? Thanks.

 // Write the records.
        for (let i = 0; i < records.length; ++i) {
            // Write one record.
            let record = records[i];
            validateRecord(dbf.fields, record);
            let offset = 0;
            buffer.writeUInt8(0x20, offset++); // Record deleted flag
            // Write each field in the record.
            for (let j = 0; j < dbf.fields.length; ++j) {
                // Get the field's value.
                let field = dbf.fields[j];
                let value = record[field.name];
                if (value === null || typeof value === 'undefined')
                    value = '';
                let encoding = getEncodingForField(field, dbf._encoding);
                // Encode the field in the buffer, according to its type.
                switch (field.type) {
                    case 'C': // Text
                        let b = iconv.encode(value, encoding);
                        for (let k = 0; k < field.size; ++k) {
                            let byte = k < b.length ? b[k] : 0x20;
                            buffer.writeUInt8(byte, offset++);
                        }
                        break;
                    case 'N': // Number
                    case 'F': // Float - appears to be treated identically to Number
                        value = value.toString();
                        value = value.slice(0, field.size);
                        while (value.length < field.size)
                            value = ' ' + value;
                        iconv.encode(value, encoding).copy(buffer, offset, 0, field.size);
                        offset += field.size;
                        break;
                    case 'L': // Boolean
                        buffer.writeUInt8(value ? 0x54 /* 'T' */ : 0x46 /* 'F' */, offset++);
                        break;
                    case 'T': // DateTime
                        const { julianDay, msSinceMidnight } = utils_2.formatVfpDateTime(value);
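
A hedged sketch of the kind of change being requested (not the library's actual code): treat a null/empty DateTime value as a blank field instead of passing it to formatVfpDateTime. The two Int32LE writes below are assumed from the Visual FoxPro DateTime layout (4-byte Julian day plus 4-byte milliseconds since midnight), and whether a blank 'T' field should be spaces or zero bytes would need checking against how VFP itself writes them.

    // Sketch only: write a 'T' (DateTime) field, skipping formatting when the
    // value is empty and filling the 8-byte field with spaces instead.
    function writeDateTimeField(buffer, offset, field, value, formatVfpDateTime) {
        if (value === null || value === undefined || value === '') {
            for (let k = 0; k < field.size; ++k) buffer.writeUInt8(0x20, offset++);
            return offset;
        }
        const { julianDay, msSinceMidnight } = formatVfpDateTime(value);
        buffer.writeInt32LE(julianDay, offset); offset += 4;
        buffer.writeInt32LE(msSinceMidnight, offset); offset += 4;
        return offset;
    }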

Show the field name when validation fails

It would help a lot if the validation could show which field failed validation.

See below for a small change I made to the code.

function validateRecord(fields, record) {
    for (let i = 0; i < fields.length; ++i) {
        let name = fields[i].name, type = fields[i].type;
        let value = record[name];
        // Always allow null values
        if (value === null || typeof value === 'undefined')
            continue;
        // Perform type-specific checks
        if (type === 'C') {
            if (typeof value !== 'string')
                throw new Error('Expected a string for ' + name);
            if (value.length > 255)
                throw new Error('Text is too long (maximum length is 255 chars) for ' + name);
        }
        else if (type === 'N' || type === 'F' || type === 'I') {
            if (typeof value !== 'number')
                throw new Error('Expected a number for ' + name + ' but got ' + value);
        }
        else if (type === 'D') {
            if (!(value instanceof Date))
                throw new Error('Expected a date for ' + name);
        }
    }
}

Support for concurrent reads/writes

Hi thank you for the great work!

I'm fairly new to DBF files, and unfortunately I can't find much documentation online about DBF specifications.

Does this package (or the DBF format itself) support concurrent reads and writes?

For context, I'm currently working on an API integration for a library system where they use DBF files on a network drive as their database. When I read the DBF files, will it cause issues for the existing terminals (responsible for checking books in and out)?

Thank you once again.

Error: Type 'F' is not supported

I have a DBF that has some float data in it, and I'm getting "Error: Type 'F' is not supported" when reading the data. I was curious whether float data is on the roadmap of supported field types. I looked at the code and it looks as if I could add another case statement for it, but I'm not familiar with any gotchas I may run into in the process. Can you help? @yortus
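
For what it's worth, 'N' and 'F' fields are both stored as right-padded ASCII numbers (the library's own write path, quoted in an earlier issue, treats them identically), so an added 'F' case can most likely reuse the numeric parsing. A hypothetical standalone helper, not the library's code:

    // Parse the decoded text of an 'N' or 'F' field: blank fields become null,
    // everything else is parsed as a float.
    function parseNumericField(rawText) {
        const trimmed = rawText.trim();
        return trimmed === '' ? null : parseFloat(trimmed);
    }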

Getting error (node:7468) UnhandledPromiseRejectionWarning: Error: Type '0' is not supported at Object.validateFieldDescriptor (C:\chatbot\node_modules\dbffile\dist\field-descriptor.js:16:15) at openDBF (C:\chatbot\node_modules\dbffile\dist\dbf-file.js:82:32)

I am currently using a big DBF file and I am getting the following error:

(node:7468) UnhandledPromiseRejectionWarning: Error: Type '0' is not supported
at Object.validateFieldDescriptor (C:\chatbot\node_modules\dbffile\dist\field-descriptor.js:16:15)
at openDBF (C:\chatbot\node_modules\dbffile\dist\dbf-file.js:82:32)

Why is there a type '0'?

Invalid DBF: Incorrect record length problem

Invalid DBF: Incorrect record length

Can I ask what this error message means?

I guess it's about a difference between the field length attributes and the record length, or something similar. But I couldn't figure it out, so could I get an answer?
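
For what it's worth, a small standalone check (a sketch based on the general DBF layout, not the library's code): the record length stored in header bytes 10–11 should equal 1 (the deletion flag) plus the sum of all field sizes, so comparing the two can show where the mismatch is. The field-count arithmetic assumes a classic dBase III-style header (32-byte header, 32-byte descriptors, 0x0D terminator).

    const fs = require('fs/promises');

    // Sketch: compare the record length stored in the DBF header with the
    // length implied by the field descriptors (1 deletion-flag byte + sizes).
    async function checkRecordLength(dbfPath) {
        const fd = await fs.open(dbfPath, 'r');
        try {
            const header = Buffer.alloc(32);
            await fd.read(header, 0, 32, 0);
            const headerLength = header.readUInt16LE(8);
            const recordLength = header.readUInt16LE(10);
            const fieldCount = Math.floor((headerLength - 32 - 1) / 32);
            const descriptors = Buffer.alloc(fieldCount * 32);
            await fd.read(descriptors, 0, descriptors.length, 32);
            let impliedLength = 1; // deletion flag
            for (let i = 0; i < fieldCount; ++i) {
                impliedLength += descriptors.readUInt8(i * 32 + 16); // field size byte
            }
            console.log({ recordLength, impliedLength });
        } finally {
            await fd.close();
        }
    }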

Search DBF by key

Hello.

I'm trying to get a specific record from the database rather than reading the whole file. Is it possible to get, for example, the NAME column of the row whose ID = 10657? Something "SQL style", so to speak. I receive a person's ID and I want to know their card ID.

Thank you very much in advance.
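
As far as these issues show, the library reads records sequentially and does not use DBF index files, so a keyed lookup means scanning and filtering in JavaScript. A minimal sketch, assuming the async-iteration support in newer dbffile releases (the ID and NAME field names are just this issue's example):

    const { DBFFile } = require('dbffile');

    // Scan the file and return the NAME of the first record whose ID matches,
    // stopping as soon as it is found.
    async function findNameById(path, id) {
        const dbf = await DBFFile.open(path);
        for await (const record of dbf) {
            if (record.ID === id) return record.NAME;
        }
        return undefined;
    }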

Reading a very large file

Hi @yortus

I was having trouble reading a large file.
dbf.readRecords(dbf.recordCount) results in an out-of-memory exception.
Could you suggest a way to iterate through the whole file in small chunks?
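
A minimal sketch of chunked reading, relying on readRecords continuing from where the previous call stopped (it returns fewer rows than requested once the end of the file is reached):

    const { DBFFile } = require('dbffile');

    async function processInChunks(path, chunkSize = 1000) {
        const dbf = await DBFFile.open(path);
        while (true) {
            // Each call continues from the previous read position.
            const rows = await dbf.readRecords(chunkSize);
            for (const row of rows) {
                // ...process one row...
            }
            if (rows.length < chunkSize) break; // reached the end of the file
        }
    }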

save file error

I get this error when I try to save a file:
An error occurred: TypeError: fs.openAsync is not a function
Here is my code:

 DBFFile.create(`Archivo - ${moment().format(format)}.dbf`, fieldDescriptors)
          .then(dbf => {
              console.log('DBF file created.');
              return dbf.append(rows);
          })
          .then(() => console.log(rows.length + ' rows added.'))
          .catch(err => console.log('An error occurred: ' + err));
