
Index Mongoose models into elasticsearch automatically.

Home Page: https://mongoosastic.github.io/mongoosastic/

License: MIT License

Topics: elasticsearch, elasticsearch-queries, mongoose, javascript, elastic, mongodb, nodejs

mongoosastic's Introduction

Mongoosastic


Mongoosastic is a mongoose plugin that can automatically index your models into elasticsearch.

Getting started

  1. Install the package
npm install mongoosastic
  2. Set up your mongoose model to use the plugin
const mongoose     = require('mongoose')
const mongoosastic = require('mongoosastic')
const Schema       = mongoose.Schema

const User = new Schema({
    name: String,
    email: String,
    city: String
})

User.plugin(mongoosastic)
  3. Query Elasticsearch with the search() method (added by the plugin)
const results = await User.search({
  query_string: {
    query: "john"
  }
});

NOTE: You can also query Elasticsearch with any other method. Example:

curl http://localhost:9200/users/user/_search
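For reference, `search()` wraps your query in a standard Elasticsearch request body. A minimal sketch of that wrapper (the `'users'`/`'user'` index and type names are illustrative assumptions, and the real plugin handles many more options):

```javascript
// Sketch of the request the plugin sends for User.search({ query_string: ... }).
// Index and type names ('users'/'user') are illustrative assumptions.
function buildEsRequest(query, options) {
  options = options || {};
  return {
    index: options.index || 'users',
    type: options.type || 'user',
    body: { query: query } // the search query always goes inside `body.query`
  };
}

const req = buildEsRequest({ query_string: { query: 'john' } });
```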

Documentation

View docs

mongoosastic's People

Contributors

albanm, bobrown101, callumgare, chapel, derekdomino, enrichz, francesconero, gazsp, greenkeeper[bot], guumaster, isayme, jamescarr, jasonmore, jeresig, jonjburgess, kyleamathews, mahnunchik, ngmitam, nlko, phillro, renovate-bot, renovate[bot], samypesse, sascha, srfrnk, stickycube, sukrubezen, taterbase, yahiakr, yoitsro


mongoosastic's Issues

Settings ignored when creating index

In the function createMappingIfNotPresent() the settings are ignored.
I'm not familiar with the ElasticSearch API but I think we should call putSettings() after having created the index.

highlight does not work

I find highlight does not work and when I dive into the code I saw:

var model = this,
    esQuery = {
      body: { query: query },
      index: options.index || indexName,
      type: options.type || typeName
    }

Object.keys(options).forEach(function(opt) {
  if (!opt.match(/hydrate/) && options.hasOwnProperty(opt))
    esQuery[opt] = options[opt]
})

There seems to be no way to set highlight parameters in the body.
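A workaround sketch (not the plugin's actual fix): copy the `highlight` option into the request body, since Elasticsearch expects it there. Function and parameter names mirror the quoted snippet but are otherwise assumptions:

```javascript
// Sketch: merge a `highlight` option into the ES request body.
// Mirrors the esQuery-building code quoted above; names are assumptions.
function buildEsQuery(query, options, indexName, typeName) {
  const esQuery = {
    body: { query: query },
    index: options.index || indexName,
    type: options.type || typeName
  };
  if (options.highlight) {
    esQuery.body.highlight = options.highlight; // ES expects highlight inside the body
  }
  Object.keys(options).forEach(function (opt) {
    // skip hydrate (plugin-internal) and highlight (already placed in the body)
    if (!/hydrate|highlight/.test(opt)) {
      esQuery[opt] = options[opt];
    }
  });
  return esQuery;
}
```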

Incorrect Mapping Created for Nested Types

taterbase#79 (comment)

Trying to use ElasticSearch's Facet Terms capability. To do so, I need to set one of my fields to use the "keyword" analyzer instead of the default.

Here is my code:

var chatSchema = new Schema({
        cluster: String,
        time: Number,
        room: {type:String, es_indexed:true, es_analyzer:'keyword'},
        user: {type:String, es_indexed:true},
        message: {type:String, es_indexed:true}
})

var clusterSchema = new Schema({
    name: String,
    info:{
        chatCount: Number,
        start: Number,
        end: Number,
        rooms: Array,
        senders: Array
    },
    chat: { type:[chatSchema], es_indexed:true }
})

clusterSchema.plugin(mongoosastic);

The "analyzer" attribute is not created or passed correctly to ElasticSearch. Here is what is created in ElasticSearch:

mappings: {
    cluster: {
        properties: {
            chat: {
                properties: {
                    message: { type: string },
                    time: { type: long },
                    _id: { type: string },
                    __v: { type: long },
                    cluster: { type: string },
                    user: { type: string },
                    room: { type: string }
                }
            },
            info: {
                properties: {
                    senders: { type: string },
                    start: { type: long },
                    chatCount: { type: long },
                    end: { type: long },
                    rooms: { type: string }
                }
            }
        }
    }
}

A few things to notice:

  • Although I haven't marked the multi-field "info" to be mapped, it is.
  • The "analyzer" attribute is not present on the room field in the chatSchema.
  • The cluster and time fields of the chatSchema (although not marked to be indexed) are being indexed, as well as _id and __v, which I assume are managed by Mongoose/MongoDB.

Using Version: 0.2.6

Failed to do nested query

Hi, I have the following schema with mongoose:

var Comment = new Schema({
    title: String
  , body: String
  , author: String
})


var User = new Schema({
    name: {type:String, es_indexed:true}
  , email: String
  , city: String
  , comments: {type:[Comment], es_indexed:true}
})

And no custom mapping to es.

Now I want to search for all users who have comments whose body contains the word 'like'. I tried the following:

Blog.search({
    nested: {
      path: 'comments',
      query: {
        bool: {
          must: [
            {
              match : {
                "comments.body" : "like"
              }
            }
          ]
        }
      }
    }
  })

But I get an error like: nested object under path [comments] is not of nested type

Do you know how to fix it? Sorry, I am a newbie in ES; I searched around but found no helpful information. Thanks!
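The error means Elasticsearch mapped `comments` as a plain object rather than a `nested` type, so `nested` queries are rejected. A sketch of requesting a nested mapping through the plugin's `es_type` field option (assuming your mongoosastic version honors `es_type` on subdocument arrays; otherwise create the mapping manually):

```javascript
// Sketch only: a plain object stands in for the mongoose Comment schema.
const Comment = { title: String, body: String, author: String };

const userSchemaDefinition = {
  name: { type: String, es_indexed: true },
  // es_type: 'nested' asks the mapping generator for a true nested mapping,
  // which the `nested` query requires. (Assumes your version supports es_type here.)
  comments: { type: [Comment], es_indexed: true, es_type: 'nested' }
};
```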

Cannot find a way to paginate results

I think the documentation is missing something...
I'm trying to use the standard Elasticsearch properties to set size and from, but without success.

I'm performing a full-text search on my users, where the controller receives the search string in the request body:

User.search({
    query_string: {
      query: req.body.search
    }
},
{ hydrate: true },
function(err, results) {
    var users = results.hits.hits;
});

Where should I define from and size to handle pagination?
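`from` and `size` belong in the options argument (the second parameter), which the plugin forwards to the Elasticsearch request. A sketch of a paging helper, assuming options pass through unchanged:

```javascript
// Build paging options for search(query, options, cb).
// `page` is 1-based; returns ES-style from/size offsets.
function pageOptions(page, perPage) {
  return {
    from: (page - 1) * perPage,
    size: perPage,
    hydrate: true // keep hydrating into mongoose documents
  };
}

// Usage sketch (query shape taken from the snippet above):
// User.search({ query_string: { query: req.body.search } }, pageOptions(3, 10), cb);
```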

{ [Error: ETIMEDOUT] code: 'ETIMEDOUT' }

taterbase#129 (comment)

A time after I run model.synchronize(); I get the following messages.

{ [Error: ETIMEDOUT] code: 'ETIMEDOUT' }
{ [Error: ETIMEDOUT] code: 'ETIMEDOUT' }
{ [Error: ETIMEDOUT] code: 'ETIMEDOUT' }
{ [Error: ETIMEDOUT] code: 'ETIMEDOUT' }
{ [Error: ETIMEDOUT] code: 'ETIMEDOUT' }
{ [Error: ETIMEDOUT] code: 'ETIMEDOUT' }
{ [Error: ETIMEDOUT] code: 'ETIMEDOUT' }
{ [Error: ETIMEDOUT] code: 'ETIMEDOUT' }
{ [Error: ETIMEDOUT] code: 'ETIMEDOUT' }
{ [Error: ETIMEDOUT] code: 'ETIMEDOUT' }

And so on.

I have 250k+ docs in MongoDB, and I want to index 2 fields.

I have the latest versions of elasticsearch, mongoosastic, and mongoose, on CentOS 6 64-bit.

Search is not restricted to a specific model/index

When searching through a model as ModelA.search({/* query */}, cb) without having previously issued a save or remove on the same model, the queries are performed globally, and not on the index associated with the model.

Referencing to another index in elasticsearch

I am using Elasticsearch to search documents that were indexed using mongoosastic. How can I refer to another index in a query, like populate in mongoose or a join in SQL? Please help me with an example.

I have two indices, project and user. In MongoDB, the project collection refers to the user collection. I am searching on project with Elasticsearch and getting results, but I want to reference the user index in the same query fired against the project index. How can I do that?

Searching for nested models

taterbase#67 (comment)

I would like to be able to search for nested models. For example, in this schema:

var Comment = new Schema({
    title: String
  , body: String
  , author: String
})


var User = new Schema({
    name: {type:String, es_indexed:true}
  , email: String
  , city: String
  , comments: {type:[Comment], es_indexed:true}
})

I would like to search for comments that match a certain text, for users in a certain city. I would like the query to return only the comments that match the query.

My questions are:

  1. Is there a way to store the comments in a separate index, where the user is the parent? (while still using mongoosastic and not a custom solution)
  2. Is there another way to search for the comments, without the need to filter the results after they are returned?
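For question 2, Elasticsearch's `inner_hits` feature can return only the matching nested documents. A query sketch, assuming `comments` is mapped as `nested` (the city value is illustrative):

```javascript
// Query sketch: match users in a given city whose comments contain 'like',
// returning only the matching comments via inner_hits.
const query = {
  bool: {
    must: [
      { match: { city: 'London' } }, // 'London' is an illustrative value
      {
        nested: {
          path: 'comments',
          query: { match: { 'comments.body': 'like' } },
          inner_hits: {} // only the matching nested comments come back here
        }
      }
    ]
  }
};

// Usage sketch: User.search(query, cb);
```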

Fix travis tests

Find a workaround for the timeout needed in test/bulk-test.js to wait for index deletion.

mapping: a property with ObjectId type ignores other es_ properties

taterbase#130 (comment)

For example, I have a schema, and it has user property, like:

user: {
        type: Schema.Types.ObjectId,
        ref: 'User',
        es_indexed: true,
        es_type: 'string',
        es_index_analyzer: 'keyword',
        es_cast: function(value){
            if (value._id) {
                return value.id;
            } else {
                return value.toString();
            }
        }
    }

I don't want the user property analyzed, so I added es_index_analyzer: 'keyword'. But I found that when mongoosastic encounters an ObjectId type, it ignores the other es_ settings, which doesn't meet my need.

esTruncate does not work

When I try to truncate a model's indices, I get this error:
[TypeError: Unable to build a path with those params. Supply at least index]
Do you guys have any idea?

Error when using complex sort object

Hi,

First, thank you for your plugin, it is very useful!

When using a complex sort object to execute a query, there is an exception.

Here is a test to reproduce it from search-features-test.js:

 it('should be able to sort by name in descending order', function(done) {
       Bond.search({
         match_all: {}
       }, {
         sort: [{
           name: { order: 'desc' }
         }]
       }, function(err, res) {
         var expected = ['Legal', 'Construction', 'Commercial', 'Bail'];
         res.hits.total.should.eql(4);
         res.hits.hits.forEach(function(bond, index) {
           expected[index].should.eql(bond._source.name);
         });
         done();
       });
});

The error is:

Message:
     Cannot read property 'hits' of undefined
 Details:
     domain: [object Object]
     domainThrown: true
 Stack:
 TypeError: Cannot read property 'hits' of undefined
     at /dev/sandbox/mongoosastic/test/search-features-test.js:88:12
     at /dev/sandbox/mongoosastic/lib/mongoosastic.js:250:16
     at respond (/dev/sandbox/mongoosastic/node_modules/elasticsearch/src/lib/transport.js:254:9)
     at checkRespForFailure (/dev/sandbox/mongoosastic/node_modules/elasticsearch/src/lib/transport.js:203:7)
     at HttpConnector.<anonymous> (/dev/sandbox/mongoosastic/node_modules/elasticsearch/src/lib/connectors/http.js:156:7)
     at IncomingMessage.wrapper (/dev/sandbox/mongoosastic/node_modules/elasticsearch/node_modules/lodash/index.js:3181:19)
     at IncomingMessage.emit (events.js:129:20)
     at _stream_readable.js:908:16
     at process._tickDomainCallback (node.js:381:11)

A possible workaround is the one below, but I am not sure it is a good solution: the sort option is placed into the body of the ES query when it is not a string (see my comments in the code).

   schema.statics.search = function(query, options, cb) {
     if (arguments.length === 2) {
       cb = arguments[1];
       options = {};
     }

     options.hydrateOptions = options.hydrateOptions || defaultHydrateOptions || {};

     if (query === null)
       query = undefined;

     var _this = this,
       sort = options.sort,  // I added this variable here
       esQuery = {
         body: {
           query: query
         },
         index: options.index || indexName,
         type: options.type || typeName
       };
     if (options.highlight) {
       esQuery.body.highlight = options.highlight;
     }

     // put the sort option in the body of the query when it is not a string
     if (sort && typeof (sort) !== 'string') {
       esQuery.body.sort = options.sort;
     }

     Object.keys(options).forEach(function(opt) {
       // keep the old way when the sort is a string
       if (!opt.match(/hydrate/) && (!opt.match(/sort/) || typeof (sort) === 'string') && options.hasOwnProperty(opt))
         esQuery[opt] = options[opt];
     });

Can't even use Mongoosastic

Hi!

I'm trying to add the mongoosastic plugin to a schema, but it throws the following error:

node_modules/mongoosastic/lib/mapping-generator.js:122
        if (paths[field].schema && paths[field].schema.tree && paths[field].sc
                        ^
TypeError: Cannot read property 'schema' of undefined
    at getCleanTree (/home/gsaloma/graco/node_modules/mongoosastic/lib/mapping-
    at getCleanTree (/home/gsaloma/graco/node_modules/mongoosastic/lib/mapping-
    at Generator.generateMapping (/home/gsaloma/graco/node_modules/mongoosastic
    at getMapping (/home/gsaloma/graco/node_modules/mongoosastic/lib/mongoosast
    at Mongoosastic (/home/gsaloma/graco/node_modules/mongoosastic/lib/mongoosa
    at Schema.plugin (/home/gsaloma/graco/node_modules/mongoose/lib/schema.js:5
    at Object.<anonymous> (/home/gsaloma/graco/modules/schema/oportunidad.js:12
    at Module._compile (module.js:456:26)
    at Object.Module._extensions..js (module.js:474:10)
    at Module.load (module.js:356:32)

Am I doing something wrong?

Cheers, Jorge

New document cannot be found by search even after the es-indexed event

Here is my case:

  1. Add a new doc and listen for the es-indexed event to make sure the index is built in ES
  2. Then search for it immediately, but the search result does not contain the new doc

I can find the new doc if I search 1-2 seconds later. So I guess the es-indexed event does not guarantee that indexing has completed?
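Correct: Elasticsearch is near-real-time, and `es-indexed` only signals that the index request was acknowledged, not that a refresh has made the document searchable. In tests you can force a refresh first; a sketch using the elasticsearch JS client's `indices.refresh` (treat the exact client API shape as an assumption):

```javascript
// Force a refresh so newly indexed docs are visible to search.
// `client` is an elasticsearch JS client; in production, prefer waiting for
// the periodic refresh instead of calling this on every write.
async function searchAfterRefresh(client, index, runSearch) {
  await client.indices.refresh({ index: index }); // make pending docs searchable
  return runSearch();
}
```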

Nested schema with geolocation and _id issue

Hi, I am new to Mongoosastic and Elasticsearch. I just want to understand the following:

var companySchema = new Schema ({
    name: {type:String,required:true, es_indexed:true},
    address :[{line1: {type:String,required:true},
                line2: String,
                city: {type:String,required:true,es_indexed:true},
                state: {type:String,required:true,es_indexed:true},
                country: {type: String,required:true, default: 'India'},
                zip: {type:String,required:true,es_indexed:true},
                location: {
                    geo_point: {
                        type: String,
                        es_type: 'geo_point',
                        es_lat_lon: true,
                        es_indexed: true,
                        es_format: "compressed",
                        es_precision: "1km"
                    },
                    lat: { type: Number},
                    lon: { type: Number}
                },
                registration_location: {
                    lat: { type: Number},
                    lon: { type: Number}
                },
                isVerificationNeeded: {type:Boolean,required:true, default: false},
                isVerified: Boolean,
                isActive: {type: Boolean,required:true, default: true}
            }],
    categoriesDealIn: [{type: Schema.Types.ObjectId,required:true, ref: 'Category',es_indexed:true}],
    classificationCategories: [{type: String,required:true, ref: 'ClassificationCategory',es_indexed:true}]
});

This generates:

"mappings" : {
      "company" : {
        "properties" : {
          "address" : {
            "properties" : {
              "city" : {
                "type" : "string"
              },
              "location" : {
                "properties" : {
                  "lat" : {
                    "type" : "long"
                  },
                  "lon" : {
                    "type" : "long"
                  }
                }
              },
              "state" : {
                "type" : "string"
              },
              "zip" : {
                "type" : "string"
              }
            }
          },
          "categoriesDealIn" : {
            "type" : "string"
          },
          "classificationCategories" : {
            "type" : "string"
          },
          "name" : {
            "type" : "string"
          },
        }
    }
}

whereas we need:

{
  "instafind" : {
    "aliases" : { },
    "mappings" : {
      "company" : {
        "properties" : {
          "address" : {
            "type" : "nested",
            "properties" : {
              "city" : {
                "type" : "string"
              },
              "location" : {
                "type" : "geo_point",
                "fielddata" : {
                  "precision" : "1km",
                  "format" : "compressed"
                }
              },
              "state" : {
                "type" : "string"
              },
              "zip" : {
                "type" : "string"
              }
            }
          },
          "categoriesDealIn" : {
            "type" : "string"
          },
          "classificationCategories" : {
            "type" : "string"
          },
          "name" : {
            "type" : "string",
            "index_analyzer" : "autocomplete",
            "search_analyzer" : "standard"
          },
        }
      }
    },
    "settings" : {
      "index" : {
        "creation_date" : "1426258942277",
        "uuid" : "7Y6j_XgeSau2BdCFSuU6jQ",
        "analysis" : {
          "analyzer" : {
            "autocomplete" : {
              "type" : "custom",
              "filter" : [ "lowercase", "autocomplete_filter" ],
              "tokenizer" : "standard"
            }
          },
          "filter" : {
            "autocomplete_filter" : {
              "min_gram" : "2",
              "type" : "edge_ngram",
              "max_gram" : "20"
            }
          }
        },
        "number_of_replicas" : "1",
        "number_of_shards" : "1",
        "version" : {
          "created" : "1040499"
        }
      }
    },
    "warmers" : { }
  }
}

Can you please explain what changes we need to make in our JavaScript code for this?

Similarly, we have classification schema:

var classificationSchema = new Schema({
    _id: {type:String, es_indexed:true}, //path ->
    name: {type:String,required:true, es_indexed:true}
});

We want an autocomplete filter for the _id field here too. How can that be achieved?

Thanks!

mapping generator error

taterbase#65 (comment)

Hello there. Thanks for this fantastic project. I noticed a small bug as I updated my schema:
mongoosastic fails to generate my schema mapping when the schema has complex attributes like the following:

new Schema({
    recents: {
      list: [{type: ObjectId, ref: 'User'}],
      size: {type: Number}
    }
  });

I don't even need to index these attributes, but
createMapping generates the following error:

        else if ( paths[field].caster && paths[field].caster.instance ) {
                              ^
TypeError: Cannot read property 'caster' of undefined
  at getCleanTree (/Users/pg/Jogabo/app/node_modules/mongoosastic/lib/mapping-generator.js:122:31)
  at getCleanTree (/Users/pg/Jogabo/app/node_modules/mongoosastic/lib/mapping-generator.js:148:28)
  at Generator.generateMapping (/Users/pg/Jogabo/app/node_modules/mongoosastic/lib/mapping-generator.js:5:19)
  at createMappingIfNotPresent (/Users/pg/Jogabo/app/node_modules/mongoosastic/lib/mongoosastic.js:142:13)
  at Function.module.exports.schema.statics.createMapping (/Users/pg/Jogabo/app/node_modules/mongoosastic/lib/mongoosastic.js:39:5)

What version of Elasticsearch does Mongoosastic support?

I'm currently using Elasticsearch v1.4.2 and Mongoosastic v2.0.6 and getting the error below from Elasticsearch when trying to index a document (I'm assuming this is due to a protocol mismatch between Mongoosastic and Elasticsearch).

I had a quick scan of the readme, but I can't see anything about which Elasticsearch versions are supported.

[2014-12-23 14:15:25,929][WARN ][transport.netty          ] [Gauntlet] exception caught on transport layer [[id: 0xdc9754dd, /127.0.0.1:63167 => /127.0.0.1:9300]], closing connection
java.io.StreamCorruptedException: invalid internal transport message format, got (50,4f,53,54)
    at org.elasticsearch.transport.netty.SizeHeaderFrameDecoder.decode(SizeHeaderFrameDecoder.java:47)
    at org.elasticsearch.common.netty.handler.codec.frame.FrameDecoder.callDecode(FrameDecoder.java:425)
    at org.elasticsearch.common.netty.handler.codec.frame.FrameDecoder.messageReceived(FrameDecoder.java:303)
    at org.elasticsearch.common.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
    at org.elasticsearch.common.netty.OpenChannelsHandler.handleUpstream(OpenChannelsHandler.java:74)
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559)
    at org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(Channels.java:268)
    at org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(Channels.java:255)
    at org.elasticsearch.common.netty.channel.socket.nio.NioWorker.read(NioWorker.java:88)
    at org.elasticsearch.common.netty.channel.socket.nio.AbstractNioWorker.process(AbstractNioWorker.java:108)
    at org.elasticsearch.common.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:318)
    at org.elasticsearch.common.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:89)
    at org.elasticsearch.common.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178)
    at org.elasticsearch.common.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108)
    at org.elasticsearch.common.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:744)

Analyzer for a field type is ignored

Hi Guys,

I am trying to define a specific analyzer for some fields, but it is being ignored. I am following the convention of prefixing 'es_' to field options that map to the Elasticsearch mapping configuration.

Example:

var MyModel = new Schema({
  field: {
    type: String,
    es_indexed: true,
    es_analyzer: 'my_analyzer'
  }
});

I have also tried, prior to running the app, to set up the index manually myself to make sure my custom analyzer is available, but the mappings still don't use it.

For now, I have to manually use the REST API to delete the mapping, map just the fields I want a specific analyzer on, and then use mongoosastic to save a model via my app to set up the remaining fields with the defaults.

Removing

Hey guys. When we remove a document from our Mongo, how do we remove the reference from Elasticsearch as well? In the official example, there is only documentation about saving a document; what about deleting one? Or do we also use doc.on('es-indexed') when removing (findOneAndRemove)?
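The plugin hooks document-level `remove()` (query helpers like `findOneAndRemove` bypass mongoose middleware, so they won't unindex). A sketch of waiting for the deletion to reach Elasticsearch; the `es-removed` event name follows the plugin's emitter convention, but verify it against your version:

```javascript
// Remove a mongoose document and resolve once Elasticsearch confirms.
// `doc` must be a loaded mongoose document (not a query-helper result),
// so that the plugin's remove middleware actually runs.
function removeAndUnindex(doc) {
  return new Promise(function (resolve, reject) {
    doc.once('es-removed', function (err, res) {
      if (err) reject(err);
      else resolve(res);
    });
    doc.remove();
  });
}
```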

Thanks in advance

Is the Suggest endpoint exposed?

Hi Guys,

I wanted to ask if the suggest endpoint is exposed in mongoosastic, as I would like to make suggest requests to Elasticsearch without needing to run a query.

Is this currently possible? My other option is to use the JavaScript client API directly.

Thanks

search returning results from wrong index

Hi, I need some help.
I have a model defined like this:

var productSchema = new Schema({'master': Array, ...})
productSchema.plugin(mongoosastic);
var Product = mongoose.model('Product', productSchema);

Using the following query, I get results that aren't from the products index:

Product.search({
  'match': {
    'master' : 'somestring' // there are multiple schemas which contain 'master': Array
  }
}, function (err, products) {
      console.log(products.hits.hits)
});

// the output contains hits that have '_index' different than 'products'

What am I doing wrong?

I've worked around the issue by using a filtered query whose filter checks for a field that only exists on products, but I'm convinced this is redundant and that I did something wrong in my code.

Error with geo_code with Elasticsearch version 1.3.1

taterbase#114 (comment)

Hi All,
I'm getting this error when I index my models with the new version 1.3.1. Any ideas why?

{ error: 'RemoteTransportException[...][index]]; nested: MapperParsingException[failed to parse]; nested: ElasticsearchParseException[field must be either \'lat\', \'lon\' or \'geohash\']; '

My mapping looks like this:

location: {
  geo_point: {
    type: String,
    es_type: 'geo_point',
    es_lat_lon: true
  },
  lat: { type: Number },
  lon: { type: Number }
}

Index referenced documents

taterbase#101 (comment)

I'm attempting to index referenced documents. (Without making them sub-documents)

For example;

var Author = mongoose.Schema({
    name: { type: String }
});

var Page = mongoose.Schema({
    title: { type: String },
    author: { type: ObjectId, ref: 'Author' }
});

After getting indexed, this becomes something like this:

"author": {
    _bsontype: "ObjectID",
    id: "SYæ�Sþ\n®àÊF$"
},

Is there any way to index the full referenced object instead of just the ObjectId?
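mongoosastic has a convention for indexing populated references via `es_schema`/`es_select` field options; a sketch (verify the option names against your version, and note the referenced document must be populated at save time):

```javascript
// Sketch: ask mongoosastic to index the populated Author, not the raw ObjectId.
// A plain object stands in for the real mongoose Author schema.
const Author = { name: { type: String, es_indexed: true } };

const pageDefinition = {
  title: { type: String, es_indexed: true },
  author: {
    type: Object,        // ObjectId in the real schema
    ref: 'Author',
    es_schema: Author,   // mapping comes from the referenced schema
    es_indexed: true,
    es_select: 'name'    // only index the author's name field
  }
};
```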

not_analyzed parameter is not applied when indexing fields

Hi, I am indexing a mongoose model into Elasticsearch by specifying the fields using es_indexed: true, which worked fine.

Now I want some of those fields not to be analyzed by using the index: 'not_analyzed' option, but it doesn't work.

I ran GET /actions/_mapping in Elasticsearch to check the mapping, and the option is not shown. Therefore I guess that mongoosastic is not passing it.

I also tried deleting the index from Elasticsearch to ensure it takes my edits into account, but that didn't change anything either.

If I manually enter the index using PUT /actions, then it works, but the goal is to have it in the schema declaration, as below.

var actionSchema = exports.Schema = mongoose.Schema({
  published: {
    type: Date,
    default: Date.now,
    es_indexed: true
  },
  actorUser: {
    type: ObjectId,
    ref: 'User',
    es_indexed: true
  },
  actor: {
    type: activityObject,
    es_indexed: true
  },
  pictureUrl: String,
  verb: {
    type: String,
    es_indexed: true,
    index: 'not_analyzed'
  },
  object: {
    type: activityObject,
    es_indexed: true
  },
  target: {
    id: {
      type: String,
      es_indexed: true
    },
    displayName: {
      type: String,
      es_indexed: true,
      index: 'not_analyzed'
    },
    objectType: {
      type: String,
      es_indexed: true
    },
    image: String,
    url: String
  },
  path: {
    type: String,
    es_indexed: true
  },
  recipients: [{
    type: ObjectId,
    ref: 'User'
  }]
});

Am I doing something wrong? What should I do to make it work?

Thanks!
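The likely cause: mongoosastic's mapping generator only reads `es_`-prefixed options (a bare `index` is a MongoDB index hint to mongoose, not a mapping option). A sketch using the prefixed form, assuming your version forwards `es_index` into the mapping:

```javascript
// Use the es_-prefixed option so the mapping generator picks it up.
const verbField = {
  type: String,
  es_indexed: true,
  es_index: 'not_analyzed' // was: index: 'not_analyzed', which mongoosastic ignores
};
```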

has no method 'index'

I am getting an error when creating a new Model

[MODEL] has no method 'index'
    at model.<anonymous> (/app/node_modules/mongoosastic/lib/mongoosastic.js:350:13)
    at model.emit (events.js:117:20)
    at handleSave (/app/node_modules/mongoose/lib/model.js:132:10)
    at /app/node_modules/mongoose/lib/utils.js:408:16
    at /app/node_modules/mongoose/node_modules/mongodb/lib/mongodb/collection/core.js:125:9

The model that causes the error is called 'base' and doesn't use the mongoosastic plugin. Its child, which extends the 'base' model (using baseModel.add({...})), adds the plugin. So even though I'm not adding the plugin explicitly to the base model, it somehow assumes it is there and fails.

If I can provide any additional information, I would gladly do so. Thanks for looking into this.

Maintainers for this package

This package is apparently not being maintained. Maybe you could give write access to some of the top contributors to move this useful tool forward.

Thanks.

Issue in connecting cloud url

Thanks for the good lib.

I am trying to connect via a cloud URL (https://{key_name}:{key}@fili-us-east-1.searchly.com). While creating a mapping I get the following error:
Elasticsearch WARNING: 2015-02-24T07:30:22Z
Unable to revive connection: http://{key_name}:{key}@fili-us-east-1.searchly.com:9000/

Elasticsearch WARNING: 2015-02-24T07:30:22Z
No living connections

Below is my code

var url = new URL('https://site:{key_name}:{key}@fili-us-east-1.searchly.com');
console.log(url);

bookSchema.plugin(mongoosastic, {
  host: url.host,
  port: url.port,
  auth: url.auth
  // , curlDebug: true
});

var Book = mongoose.model('Book', bookSchema);

Book.createMapping(function(err, mapping) {
  if (err) {
    console.log('error creating mapping (you can safely ignore this)');
    console.log(err);
  } else {
    console.log('mapping created!');
    console.log(mapping);
  }
});

geo_point not mapping correctly in nested model

I have a nested model:

var officeSchema = new Schema({
    _id: Schema.ObjectId,
    name: String,
    location: {
        geo_point: {
            type: String,
            es_type: 'geo_point',
            es_lat_lon: true
        },
        lat: {type: Number},
        lon: {type: Number}
    }
});

var businessSchema = new Schema({
    _id: Schema.ObjectId,
    name: {type:String, es_indexed:true},
    office: {type:[officeSchema], es_indexed:true}
});

Indexing:

"location": { "lat": 1, "lon": 2}

but the office mapping geo_point result is a string:

 "office": {
      "properties": {
          "location": {
               "properties": {
                   "lat": {
                        "type": "long"
                     },
                     "lon": {
                        "type": "long"
                     }
                 }
            },

Is this a bug?

How to query a post by tag?

This is my tag

{
  id: "xxxxx",
  name: "xxxxx"
}

This is my post

{
    id:"xxxx",
    title:"xxxxx",
    content:"xxxxx",
    tags:[tag_id,....]
}

I want to query a list of posts by tag name. Does mongoosastic support this?
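If the tag documents are embedded in the indexed post (rather than stored only as ObjectIds), a match on `tags.name` should work; with bare ids you would have to look the tag up first and filter on its id. A sketch of both query shapes — the field names are assumptions based on the documents above:

```javascript
// Build query bodies for Model.search(). Both shapes assume the field
// layout shown in the post/tag snippets above.
function byTagName(name) {
  // only works if tag objects (with a name field) are embedded and indexed
  return { match: { 'tags.name': name } };
}

function byTagId(tagId) {
  // works when posts store only tag ids: resolve the tag's id first,
  // then filter posts on it
  return { term: { tags: tagId } };
}

// e.g. Post.search(byTagName('nodejs'), function (err, results) { ... });
```

Which one applies depends entirely on what you index: mongoosastic indexes what the document contains, so ids stay ids unless you denormalize the tag names into the post.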

TypeError: Undefined type at `_id.auto`

Upon upgrading from 0.0.11 to 0.5.0, our schema definitions started throwing this error:

/Users/.../node_modules/mongoose/lib/schema.js:362
    throw new TypeError('Undefined type at `' + path +
          ^
TypeError: Undefined type at `_id.auto`
  Did you try nesting Schemas? You can only nest using refs or arrays.
    at Function.Schema.interpretAsType (/Users/.../node_modules/mongoose/lib/schema.js:362:11)
    at Schema.path (/Users/.../node_modules/mongoose/lib/schema.js:305:29)
    at Schema.add (/Users/.../node_modules/mongoose/lib/schema.js:217:12)
    at Schema.add (/Users/.../node_modules/mongoose/lib/schema.js:212:14)
    at new Schema (/Users/.../node_modules/mongoose/lib/schema.js:79:10)
    at Object.<anonymous> (/Users/.../models/logs/segment.js:7:21)
    at Module._compile (module.js:456:26)
    at Object.Module._extensions..js (module.js:474:10)
    at Module.load (module.js:356:32)
    at Function.Module._load (module.js:312:12)

Here is the schema definition

var SegmentSchema = new Schema({
  name: String,
  date: { type: Date, default: Date.now },
  likelyhood: Number,
  pathId: Schema.Types.ObjectId,
}, {
  db: 'logs'
});

Again, this error doesn't occur under 0.0.11, but it does once upgraded to 0.5.0.

The end goal would be to catch up completely (and the same error presents itself under 2.0.6).

.findOneAndUpdate .update not reindexing

Since mongoose doesn't run document middleware for the above-mentioned methods, I assume mongoosastic's save hook doesn't fire either. Which raises the question: how can I manually force a re-index?
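Correct: `findOneAndUpdate` and `update` bypass the document save middleware the plugin hooks into. mongoosastic adds an `index()` method to documents, so one workaround is to fetch the updated document (`new: true`) and index it explicitly. A sketch, assuming a callback-style `index()`; the `updateAndReindex` wrapper is my own, not part of the plugin:

```javascript
// Update via findOneAndUpdate (which skips the save middleware), then
// re-index the returned document by hand.
async function updateAndReindex(Model, filter, update) {
  const doc = await Model.findOneAndUpdate(filter, update, { new: true });
  if (doc) {
    // wrap the callback-style index() in a promise
    await new Promise((resolve, reject) =>
      doc.index(err => (err ? reject(err) : resolve()))
    );
  }
  return doc;
}
```

With a bulk `.update()` there is no returned document, so you would have to re-query the affected documents yourself (or run `synchronize()`) before indexing them.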

geo_point may be badly mapped when used in conjunction with es_indexed

taterbase#86 (comment)

Hi,

I spent a couple of hours figuring out why geo_point types were mapped to double or string...

I'm using the 0.2.6 version.

I resolved my trouble by calling the createMapping function. In my opinion the sentence in the docs about the ongoing development of mapping is not clear (should we use it or not?).

In fact I did some tests without calling createMapping:

  • if I do not use es_indexed:true, the resulting mapping uses the geo_point type (so it's fine)
  • if I do use es_indexed:true, the resulting mapping is wrong and I must call createMapping to fix it.

Sorry, it is beyond my understanding why it behaves like that; I just wanted to mention it, even though I now understand that being able to avoid calling createMapping is an ongoing development.

Let me know if I can be of any help.

By the way, thanks a lot for this awesome module,

Nicolas

For reference, I used the following mapping:

        geo: {
            type: String,
            es_type: 'geo_point',
            es_indexed:true
        },
        coord: {
            geo_point: {
              type: String,
              es_type: 'geo_point',
              es_lat_lon: true,
              es_indexed:true
            },
            lat: { type: Number },
            lon: { type: Number }
        },

The mapping result was

    "properties" : {
      "coord" : {
        "properties" : {
          "lat" : {
            "type" : "double"
          },
          "lon" : {
            "type" : "double"
          }
        }
      },
      
      "geo" : {
        "type" : "string"
      }
    }

Filtering option for save hook

Hi,
I am wondering if the middleware hook for saving to mongoose can be triggered for specific types of items instead of all of them.

For example:

Item: {
    type: { es_indexed: true },
    size: { es_indexed: false }
}

I can get all the items with type indexed in elasticsearch, but what I want is to have only, let's say, items whose size is bigger than 20 indexed into elasticsearch.

Is there a way to do that?
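If memory serves, mongoosastic has a `filter` plugin option for exactly this: documents for which the function returns true are skipped (please double-check the direction of the boolean against the docs for your version). A sketch with the size predicate split out so it can be tested on its own:

```javascript
// Assumption: mongoosastic's `filter` option skips documents for which
// the function returns true.
function tooSmallToIndex(doc) {
  return doc.size <= 20;
}

// Wiring it up would look like this (schema/plugin objects elided here):
// ItemSchema.plugin(mongoosastic, { filter: tooSmallToIndex });
```

Note the filter decides whether the whole document is indexed; the per-field es_indexed flags still control which fields go into the index.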

New npm version

The repo is tagged with new version 2.0.7, and it is ready to publish. Could you please publish it, or give access to collaborators to publish?

Docs on npm access are here; for example:

npm owner add guumaster mongoosastic

Singleton or dependency injection pattern

Why don't you use a singleton or dependency injection pattern? For each schema I'm creating a new elasticsearch client. Though elasticsearch isn't a persistent connection I find it odd that each time I use this plugin on a schema I'm creating an entirely new elasticsearch client.

I basically have to load my config file, grab the connection info out, and pass it as an option for each schema. Why not just have mongoosastic.connect("host", port), call it once, and reuse the same elasticsearch client? You've gone to the extent of using the nop module, yet you initiate a new client for each schema. It's not terribly inefficient, but it's definitely not convenient, as I typically only have a single elasticsearch cluster for an entire project, but I might have multiple types/indices.

This approach would be similar to how mongoose does it themselves. Is there any reason why this hasn't been done? You also don't allow passing multiple hosts. Is there any reason for this?
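In the meantime you can approximate the singleton yourself: Node caches modules, so a module that lazily creates one client yields a single shared instance across every schema that requires it. A sketch with the client factory injected so it can be exercised without a running cluster — `makeGetClient` is my own name, not a mongoosastic API:

```javascript
// Returns a getter that creates the client once and reuses it afterwards.
function makeGetClient(createClient) {
  let client = null;
  return function getClient(config) {
    if (!client) client = createClient(config); // only the first call creates
    return client;
  };
}

// In a real project something like:
//   module.exports = makeGetClient(cfg => new elasticsearch.Client(cfg));
// and then hand the shared client (or its settings) to each schema's
// plugin options instead of rebuilding the config per schema.
```

Whether the plugin can accept a pre-built client depends on the mongoosastic version, so treat that wiring comment as an assumption to verify.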
