flipkart / aesop

A keen Observer of changes that can also relay change events reliably to interested parties. Provides useful infrastructure for building Eventually Consistent data sources and systems.

Java 70.75% Shell 0.20% JavaScript 27.84% CSS 0.57% FreeMarker 0.64%

aesop's People

Contributors

aryaketan, aymandf, dkulkarni, jagadeesh-huliyar, phantomastray, pratyay-banerjee, raviknits, regunathb, rishabhdua-zz, shoury, shubhamfk, sneha29shukla, yogeshdfk


aesop's Issues

how to set the max

There is a debug message when I run the mysql-blocking-bootstrap server. How do I configure this resource pool's maximum size?

10:14:33.918 [pool-8-thread-1] DEBUG c.m.v.resourcepool.BasicResourcePool.prelimCheckoutResource 554 c.m.v.resourcepool.BasicResourcePool - acquire test -- pool is already maxed out. [managed: 15; max: 15]
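The `c.m.v.resourcepool` logger in that message belongs to c3p0, so the "max: 15" being hit is almost certainly the c3p0 pool size. Assuming the bootstrap's DataSource is a c3p0 ComboPooledDataSource wired through Spring (the bean id and JDBC URL below are hypothetical), raising the cap would look roughly like this:

```xml
<!-- Hypothetical Spring bean definition; adjust to the actual
     spring-*-bootstrap-config.xml used by the sample. -->
<bean id="bootstrapDataSource" class="com.mchange.v2.c3p0.ComboPooledDataSource"
      destroy-method="close">
    <property name="jdbcUrl" value="jdbc:mysql://localhost:3306/or_test"/>
    <!-- maxPoolSize is the "max: 15" seen in the log; raise it here. -->
    <property name="maxPoolSize" value="50"/>
    <property name="minPoolSize" value="5"/>
</bean>
```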

Error while setting up sample-memory-relay on Windows

I have been following the instructions provided here:

https://github.com/Flipkart/aesop/wiki/Relay-Examples

to set up a new memory-relay server. I followed the exact same instruction set, but when I start my service using:

java -cp "./target/sample-memory-relay-1.2.1-SNAPSHOT.jar:./target/lib/*" org.trpr.platform.runtime.impl.bootstrap.BootstrapLauncher ./src/main/resources/external/bootstrap.xml

I keep getting this exception in my logs:

16:09:34.185 [main] WARN o.eclipse.jetty.webapp.WebAppContext - Failed startup of context o.e.j.w.WebAppContext{/,null},file:\C:\aesop\samples\sample-memory-relay\target\lib\runtime-relay-1.2.1-SNAPSHOT.jar!
java.io.FileNotFoundException: file:\C:\aesop\samples\sample-memory-relay\target\lib\runtime-relay-1.2.1-SNAPSHOT.jar!
    at org.eclipse.jetty.webapp.WebInfConfiguration.unpack(WebInfConfiguration.java:495) ~[jetty-webapp-8.1.5.v20120716.jar:8.1.5.v20120716]
    at org.eclipse.jetty.webapp.WebInfConfiguration.preConfigure(WebInfConfiguration.java:64) ~[jetty-webapp-8.1.5.v20120716.jar:8.1.5.v20120716]
    at org.eclipse.jetty.webapp.WebAppContext.preConfigure(WebAppContext.java:438) ~[jetty-webapp-8.1.5.v20120716.jar:8.1.5.v20120716]
    at org.eclipse.jetty.webapp.WebAppContext.doStart(WebAppContext.java:474) ~[jetty-webapp-8.1.5.v20120716.jar:8.1.5.v20120716]
    at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:59) [jetty-util-8.1.5.v20120716.jar:8.1.5.v20120716]
    at org.eclipse.jetty.server.handler.HandlerCollection.doStart(HandlerCollection.java:224) [jetty-server-8.1.5.v20120716.jar:8.1.5.v20120716]
    at org.eclipse.jetty.server.handler.ContextHandlerCollection.doStart(ContextHandlerCollection.java:167) [jetty-server-8.1.5.v20120716.jar:8.1.5.v20120716]
    at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:59) [jetty-util-8.1.5.v20120716.jar:8.1.5.v20120716]
    at org.eclipse.jetty.server.handler.HandlerCollection.doStart(HandlerCollection.java:224) [jetty-server-8.1.5.v20120716.jar:8.1.5.v20120716]
    at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:59) [jetty-util-8.1.5.v20120716.jar:8.1.5.v20120716]
    at org.eclipse.jetty.server.handler.HandlerWrapper.doStart(HandlerWrapper.java:90) [jetty-server-8.1.5.v20120716.jar:8.1.5.v20120716]
    at org.eclipse.jetty.server.Server.doStart(Server.java:272) [jetty-server-8.1.5.v20120716.jar:8.1.5.v20120716]
    at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:59) [jetty-util-8.1.5.v20120716.jar:8.1.5.v20120716]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_45]
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[na:1.8.0_45]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[na:1.8.0_45]
    at java.lang.reflect.Method.invoke(Unknown Source) ~[na:1.8.0_45]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeCustomInitMethod(AbstractAutowireCapableBeanFactory.java:1638) [spring-beans-3.2.5.RELEASE.jar:3.2.5.RELEASE]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1579) [spring-beans-3.2.5.RELEASE.jar:3.2.5.RELEASE]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1509) [spring-beans-3.2.5.RELEASE.jar:3.2.5.RELEASE]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:521) [spring-beans-3.2.5.RELEASE.jar:3.2.5.RELEASE]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:458) [spring-beans-3.2.5.RELEASE.jar:3.2.5.RELEASE]
    at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:296) [spring-beans-3.2.5.RELEASE.jar:3.2.5.RELEASE]
    at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:223) [spring-beans-3.2.5.RELEASE.jar:3.2.5.RELEASE]
    at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:293) [spring-beans-3.2.5.RELEASE.jar:3.2.5.RELEASE]
    at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:194) [spring-beans-3.2.5.RELEASE.jar:3.2.5.RELEASE]
    at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:628) [spring-beans-3.2.5.RELEASE.jar:3.2.5.RELEASE]
    at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:932) [spring-context-3.2.5.RELEASE.jar:3.2.5.RELEASE]
    at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:479) [spring-context-3.2.5.RELEASE.jar:3.2.5.RELEASE]
    at org.springframework.context.support.ClassPathXmlApplicationContext.<init>(ClassPathXmlApplicationContext.java:139) [spring-context-3.2.5.RELEASE.jar:3.2.5.RELEASE]
    at org.springframework.context.support.ClassPathXmlApplicationContext.<init>(ClassPathXmlApplicationContext.java:105) [spring-context-3.2.5.RELEASE.jar:3.2.5.RELEASE]
    at com.flipkart.aesop.runtime.spring.RuntimeComponentContainer.init(RuntimeComponentContainer.java:151) [runtime-1.2.1-SNAPSHOT.jar:na]
    at org.trpr.platform.runtime.impl.container.spring.SpringContainerImpl.init(SpringContainerImpl.java:102) [runtime-core-1.3.1.jar:na]
    at org.trpr.platform.runtime.impl.bootstrap.spring.Bootstrap.start(Bootstrap.java:248) [runtime-core-1.3.1.jar:na]
    at org.trpr.platform.runtime.impl.bootstrap.spring.Bootstrap.init(Bootstrap.java:135) [runtime-core-1.3.1.jar:na]
    at org.trpr.platform.runtime.impl.bootstrap.BootstrapLauncher.main(Boots

Even though my server logs say the app has started, when I hit localhost:9090 I get a 503.

Since I am using Windows, I have been executing these commands in Git Bash.

Implement HBase Bootstrap data store

The default implementation uses MySQL, and the data stored is opaque: an Avro-serialized blob. Implement an HBase-based bootstrap that provides high write throughput and reads (range scans over SCNs).

Unable to run elasticsearch client

Hello,
I am running the mysql-relay sample and Elasticsearch together on the same machine. The first time, it ran successfully but did not insert anything into the ES cluster. When I restarted the ES client, I got the errors described below:

First, I see this:


[Trooper ASCII-art startup banner]
Runtime Nature      : SERVER
Component Container : com.flipkart.aesop.runtime.spring.ClientRuntimeComponentContainer
Startup Time        : 2,839 ms
Host Name           : localhost


The first ERROR I come across in the logs is:

16:30:48.943 [io337177125-2] ERROR     c.l.d.c.n.AbstractNettyHttpConnection$BaseHttpResponseProcessor.startResponse 754 c.l.d.c.n.NettyHttpDatabusRelayConnection - server error detected class=com.linkedin.databus.core.ScnNotFoundException message=No message provided
16:30:48.943 [io337177125-1] ERROR c.l.d.c.n.AbstractNettyHttpConnection$BaseHttpResponseProcessor.startResponse 754 c.l.d.c.n.NettyHttpDatabusRelayConnection - server error detected class=com.linkedin.databus.core.ScnNotFoundException message=No message provided

Then, a few lines later, I see this:

16:30:48.944 [pz_elastic_search_cluster-0-RelayPuller] INFO  c.l.databus.client.RelayPullThread.doPickRelay 405 sample_bookings_pz_elastic_search_cluster-0 - picked a relay:[server=DatabusServerCoordinates [_name=localhost, _address=localhost/127.0.0.1:25021, _state=ONLINE], subs=[[ps=[uri=databus:physical-source:ANY;role=ANY;rk=], pp=*:*, ls=[name=com.flipkart.aesop.events.sample.bookings]]]]
16:30:48.944 [pz_elastic_search_cluster-1-RelayPuller] INFO  c.l.databus.client.RelayPullThread.doPickRelay 405 sample_bookings_pz_elastic_search_cluster-1 - picked a relay:[server=DatabusServerCoordinates [_name=localhost, _address=localhost/127.0.0.1:25021, _state=ONLINE], subs=[[ps=[uri=databus:physical-source:ANY;role=ANY;rk=], pp=*:*, ls=[name=com.flipkart.aesop.events.sample.bookings]]]]
16:30:48.944 [pz_elastic_search_cluster-0-RelayPuller] INFO  c.l.d.c.n.AbstractNettyHttpConnection.close 141 c.l.d.c.n.NettyHttpDatabusRelayConnection - closing connection to: localhost/127.0.0.1:25021
16:30:48.944 [pz_elastic_search_cluster-1-RelayPuller] INFO  c.l.d.c.n.AbstractNettyHttpConnection.close 141 c.l.d.c.n.NettyHttpDatabusRelayConnection - closing connection to: localhost/127.0.0.1:25021
16:30:48.946 [pz_elastic_search_cluster-1-RelayPuller] INFO  c.l.databus.core.DbusPrettyLogUtils.logExceptionAtInfo 70 c.l.d.c.n.GenericHttpResponseHandler - channel to peer closed: localhost/127.0.0.1:25021
16:30:48.946 [pz_elastic_search_cluster-0-RelayPuller] INFO  c.l.databus.core.DbusPrettyLogUtils.logExceptionAtInfo 70 c.l.d.c.n.GenericHttpResponseHandler - channel to peer closed: localhost/127.0.0.1:25021
16:30:48.947 [pz_elastic_search_cluster-1-RelayPuller] ERROR c.l.databus.core.DbusPrettyLogUtils.logExceptionAtError 170 c.l.d.c.n.GenericHttpResponseHandler - <1414808544_WAIT_FOR_CHUNK>got closed channel while waiting for response
16:30:48.947 [pz_elastic_search_cluster-0-RelayPuller] ERROR c.l.databus.core.DbusPrettyLogUtils.logExceptionAtError 170 c.l.d.c.n.GenericHttpResponseHandler - <1837525615_WAIT_FOR_CHUNK>got closed channel while waiting for response
16:30:48.947 [pz_elastic_search_cluster-1-RelayPuller] ERROR c.l.databus.core.DbusPrettyLogUtils.logExceptionAtError 174 c.l.d.c.n.StreamHttpResponseProcessor - Exception during /stream response: . Exception message = java.nio.channels.ClosedChannelException. Exception cause = null
16:30:48.947 [pz_elastic_search_cluster-0-RelayPuller] ERROR c.l.databus.core.DbusPrettyLogUtils.logExceptionAtError 174 c.l.d.c.n.StreamHttpResponseProcessor - Exception during /stream response: . Exception message = java.nio.channels.ClosedChannelException. Exception cause = null

Overall I see the same pattern: ScnNotFoundException followed by a closed channel. Everything is running locally on the same machine, both the MySQL relay and the Elasticsearch client. Port 25021 shows as listening when I grep the netstat output.

differences and use cases between bootstrap and blocking bootstrap

Under the runtimes directory there are:
- runtime-blocking-bootstrap (based on databus-bootstrap-server)
- runtime-bootstrap (based on databus-bootstrap-server)
- runtime-client-bootstrap-producer

and the samples directory contains:
- sample-bootstrap-server
- sample-client-bootstrap-producer
- sample-mysql-mysql-blocking-bootstrap

The only difference between the following two is the serialization of the change event:
- runtime-blocking-bootstrap (based on databus-bootstrap-server)
- runtime-bootstrap (based on databus-bootstrap-server)

When the system is being set up for the first time, there will be a lot of data accumulated since day one. This data needs to be transformed and stored in the destination data store the same way it would be for events arriving in real time.

However, there is an issue in reusing the usual relay and client. Since the client's throughput is limited by that of the destination data store, the clients would fall off the relay. The buffer in the relay does not help: the relay's throughput is far higher than the client's, so the buffer fills up and starts overwriting events even before the client has been able to pull them.
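The fall-off can be illustrated with a toy ring buffer: once the producer laps a slow consumer, the SCN the consumer asks for next has been overwritten, which is the situation behind databus's ScnNotFoundException. This is purely an illustrative model, not aesop's actual buffer implementation:

```java
// Toy model of a relay's in-memory event buffer: a fixed-size ring that
// overwrites the oldest events. Once a slow consumer is lapped, the SCN
// it wants is gone. Illustrative only; not aesop's actual buffer.
public class RelayRingBuffer {
    private final long[] scns;
    private long nextScn = 0;          // next SCN the producer will write

    RelayRingBuffer(int capacity) { this.scns = new long[capacity]; }

    /** Producer writes one event, overwriting the oldest slot when full. */
    void produce() { scns[(int) (nextScn % scns.length)] = nextScn++; }

    /** True if the event with this SCN is still available in the buffer. */
    boolean has(long scn) {
        return scn < nextScn && scn >= nextScn - scns.length;
    }

    public static void main(String[] args) {
        RelayRingBuffer relay = new RelayRingBuffer(4);
        for (int i = 0; i < 10; i++) relay.produce();   // producer runs ahead
        System.out.println(relay.has(9)); // prints "true"  (recent event)
        System.out.println(relay.has(2)); // prints "false" (overwritten -> ScnNotFound)
    }
}
```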

As proposed in issue #43:

1. Enable binlog on the original DB.
2. Take a dump from the original DB.
3. Create a new DB with binlog enabled.
4. Import the dump into this newly created DB.
5. Make the newly created DB a slave of the original DB.
6. Start the bootstrap on the newly created DB and wait for it to catch up.
7. After catch-up, stop the bootstrap.
8. Start the relay consumer on the original DB from the catch-up location.
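Steps 1 and 3 (enabling the binlog) usually come down to a my.cnf fragment like the one below; row-level logging is what binlog-parsing relays generally expect. A hedged sketch, with illustrative values:

```ini
# my.cnf fragment -- values are illustrative, adjust to your topology
[mysqld]
server-id        = 1          # must be unique per server in the replication topology
log-bin          = mysql-bin  # the binlog prefix referenced in the relay URI
binlog_format    = ROW        # row-level events, needed for change capture
expire_logs_days = 7          # keep enough binlog history for consumers to catch up
```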

We use:
- sample-mysql-mysql-blocking-bootstrap
to realize the first seven steps, and for the last two steps we use:
- sample-mysql-relay
- sample-kafka-client-cluster-consumer

The question is when and how to use:

- sample-bootstrap-server
- sample-client-bootstrap-producer

How does the relay know that its buffer is full (or that an incoming event will overwrite older ones) and push those events to a bootstrap server? And how does the client know it has fallen off the relay and should start the client-bootstrap-producer to get long look-back updates from the bootstrap server?

Right now our MySQL does not have the binlog feature enabled. If I want to use aesop to get all the change data in near real time, I need to enable binlog first.
I can use a blocking bootstrap to restore the existing data, and then use the normal relay-consumer method to get the subsequent change data.

@aryaKetan @dhirendrafkl @regunathb

error in blocking bootstrap mode

I set up a local MySQL to get a remote server's data, but occasionally we encounter the following error.
First I stop the remote MySQL server from writing data, then dump the whole database, change the ini settings, and restart the remote server.
Then I start the local server and my blocking bootstrap.
As far as I know, binlog events are serial; in theory, since the binlog only records successfully committed data, there should not be any foreign key constraint problems.


11:32:22.726 [pool-33-thread-1] ERROR o.t.p.c.impl.logging.SLF4jLogWrapper.error 56 c.f.a.r.b.c.SourceEventProcessor - Exception occurred while processing event PreparedStatementCallback; SQL [INSERT INTO medicalrecord.medicalrecordattachment (AttachType,CreateTime,FileName,SmallImgUrl,ReportID,AttachUrl,BigImgUrl,HospitalCode,ReportPath,FileID,UploadTime,ID,AttachSize,PrintStatus,MRIID,HospitalName,ReportStatus,IsValid) VALUES(?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?)  ON DUPLICATE KEY UPDATE AttachType=?,CreateTime=?,FileName=?,SmallImgUrl=?,ReportID=?,AttachUrl=?,BigImgUrl=?,HospitalCode=?,ReportPath=?,FileID=?,UploadTime=?,AttachSize=?,PrintStatus=?,MRIID=?,HospitalName=?,ReportStatus=?,IsValid=?]; Cannot add or update a child row: a foreign key constraint fails (`medicalrecord`.`medicalrecordattachment`, CONSTRAINT `FK_Reference_5` FOREIGN KEY (`MRIID`) REFERENCES `medicalrecordinfo` (`ID`)); nested exception is com.mysql.jdbc.exceptions.jdbc4.MySQLIntegrityConstraintViolationException: Cannot add or update a child row: a foreign key constraint fails (`medicalrecord`.`medicalrecordattachment`, CONSTRAINT `FK_Reference_5` FOREIGN KEY (`MRIID`) REFERENCES `medicalrecordinfo` (`ID`))
org.springframework.dao.DataIntegrityViolationException: PreparedStatementCallback; SQL [INSERT INTO medicalrecord.medicalrecordattachment (AttachType,CreateTime,FileName,SmallImgUrl,ReportID,AttachUrl,BigImgUrl,HospitalCode,ReportPath,FileID,UploadTime,ID,AttachSize,PrintStatus,MRIID,HospitalName,ReportStatus,IsValid) VALUES(?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?)  ON DUPLICATE KEY UPDATE AttachType=?,CreateTime=?,FileName=?,SmallImgUrl=?,ReportID=?,AttachUrl=?,BigImgUrl=?,HospitalCode=?,ReportPath=?,FileID=?,UploadTime=?,AttachSize=?,PrintStatus=?,MRIID=?,HospitalName=?,ReportStatus=?,IsValid=?]; Cannot add or update a child row: a foreign key constraint fails (`medicalrecord`.`medicalrecordattachment`, CONSTRAINT `FK_Reference_5` FOREIGN KEY (`MRIID`) REFERENCES `medicalrecordinfo` (`ID`)); nested exception is com.mysql.jdbc.exceptions.jdbc4.MySQLIntegrityConstraintViolationException: Cannot add or update a child row: a foreign key constraint fails (`medicalrecord`.`medicalrecordattachment`, CONSTRAINT `FK_Reference_5` FOREIGN KEY (`MRIID`) REFERENCES `medicalrecordinfo` (`ID`))
    at org.springframework.jdbc.support.SQLErrorCodeSQLExceptionTranslator.doTranslate(SQLErrorCodeSQLExceptionTranslator.java:249) ~[spring-jdbc-3.2.5.RELEASE.jar:3.2.5.RELEASE]
    at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:72) ~[spring-jdbc-3.2.5.RELEASE.jar:3.2.5.RELEASE]
    at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:605) ~[spring-jdbc-3.2.5.RELEASE.jar:3.2.5.RELEASE]
    at org.springframework.jdbc.core.JdbcTemplate.update(JdbcTemplate.java:818) ~[spring-jdbc-3.2.5.RELEASE.jar:3.2.5.RELEASE]
    at org.springframework.jdbc.core.JdbcTemplate.update(JdbcTemplate.java:840) ~[spring-jdbc-3.2.5.RELEASE.jar:3.2.5.RELEASE]
    at org.springframework.jdbc.core.namedparam.NamedParameterJdbcTemplate.update(NamedParameterJdbcTemplate.java:281) ~[spring-jdbc-3.2.5.RELEASE.jar:3.2.5.RELEASE]
    at org.springframework.jdbc.core.namedparam.NamedParameterJdbcTemplate.update(NamedParameterJdbcTemplate.java:285) ~[spring-jdbc-3.2.5.RELEASE.jar:3.2.5.RELEASE]
    at com.flipkart.aesop.mysqldatalayer.upsert.MySQLUpsertDataLayer.upsert(MySQLUpsertDataLayer.java:58) ~[data-layer-mysql-1.2.1-SNAPSHOT.jar:na]
    at com.flipkart.aesop.destinationoperation.UpsertDestinationStoreProcessor.processDestinationEvent(UpsertDestinationStoreProcessor.java:35) ~[client-event-consumer-1.2.1-SNAPSHOT.jar:na]
    at com.flipkart.aesop.eventconsumer.implementation.DefaultEventConsumerImpl.processSourceEvent(DefaultEventConsumerImpl.java:141) ~[client-event-consumer-1.2.1-SNAPSHOT.jar:na]
    at com.flipkart.aesop.runtime.bootstrap.consumer.SourceEventProcessor.process(SourceEventProcessor.java:53) [runtime-blocking-bootstrap-1.2.1-SNAPSHOT.jar:na]
    at com.flipkart.aesop.runtime.bootstrap.consumer.SourceEventProcessor.run(SourceEventProcessor.java:48) [runtime-blocking-bootstrap-1.2.1-SNAPSHOT.jar:na]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_51]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_51]
    at java.lang.Thread.run(Thread.java:745) [na:1.8.0_51]
Caused by: com.mysql.jdbc.exceptions.jdbc4.MySQLIntegrityConstraintViolationException: Cannot add or update a child row: a foreign key constraint fails (`medicalrecord`.`medicalrecordattachment`, CONSTRAINT `FK_Reference_5` FOREIGN KEY (`MRIID`) REFERENCES `medicalrecordinfo` (`ID`))
    at sun.reflect.GeneratedConstructorAccessor31.newInstance(Unknown Source) ~[na:na]
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[na:1.8.0_51]
    at java.lang.reflect.Constructor.newInstance(Constructor.java:422) ~[na:1.8.0_51]
    at com.mysql.jdbc.Util.handleNewInstance(Util.java:389) ~[mysql-connector-java-5.1.35.jar:5.1.35]
    at com.mysql.jdbc.Util.getInstance(Util.java:372) ~[mysql-connector-java-5.1.35.jar:5.1.35]
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:973) ~[mysql-connector-java-5.1.35.jar:5.1.35]
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3835) ~[mysql-connector-java-5.1.35.jar:5.1.35]
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3771) ~[mysql-connector-java-5.1.35.jar:5.1.35]
    at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:2435) ~[mysql-connector-java-5.1.35.jar:5.1.35]
    at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2582) ~[mysql-connector-java-5.1.35.jar:5.1.35]
    at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2535) ~[mysql-connector-java-5.1.35.jar:5.1.35]
    at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:1911) ~[mysql-connector-java-5.1.35.jar:5.1.35]
    at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:2145) ~[mysql-connector-java-5.1.35.jar:5.1.35]
    at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:2081) ~[mysql-connector-java-5.1.35.jar:5.1.35]
    at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:2066) ~[mysql-connector-java-5.1.35.jar:5.1.35]
    at com.mchange.v2.c3p0.impl.NewProxyPreparedStatement.executeUpdate(NewProxyPreparedStatement.java:105) ~[c3p0-0.9.1.jar:0.9.1]
    at org.springframework.jdbc.core.JdbcTemplate$2.doInPreparedStatement(JdbcTemplate.java:824) ~[spring-jdbc-3.2.5.RELEASE.jar:3.2.5.RELEASE]
    at org.springframework.jdbc.core.JdbcTemplate$2.doInPreparedStatement(JdbcTemplate.java:818) ~[spring-jdbc-3.2.5.RELEASE.jar:3.2.5.RELEASE]
    at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:589) ~[spring-jdbc-3.2.5.RELEASE.jar:3.2.5.RELEASE]
    ... 12 common frames omitted
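One likely cause, given that the trace shows SourceEventProcessor running inside a ThreadPoolExecutor, is that the bootstrap applies events through a pool of threads, so a child-table row can reach the destination before its parent row even though the binlog itself is serial. A common workaround (a hedged suggestion, with the usual integrity caveats) is to disable foreign key checks on the destination while the bootstrap runs:

```sql
-- On the destination session(s) used by the data layer, or globally:
SET FOREIGN_KEY_CHECKS = 0;
-- ... run the blocking bootstrap ...
SET FOREIGN_KEY_CHECKS = 1;
```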

how to recreate the original database

If I deploy a relay on MySQL and successfully get the CDC event messages into Kafka topics, how do I recreate the original MySQL database from these messages? Is there a complete example to follow?
Thanks

Hbase Producer Event Skip Issue

Under high concurrency, events were getting skipped by the aesop consumer, even though the same events were being replicated from HBase to the aesop producer.

binary type seems not supported by avro

Hi,

I'm using your Avro schema tool to generate the Avro schema file, and using LinkedIn Databus as the code base to build the relay, client, and bootstrap binaries. I get an error when testing a MySQL VARBINARY column. Here is the stack trace. How do you process the binary type when loading the Avro schema?

org.apache.avro.SchemaParseException: Undefined name: "binary"
at org.apache.avro.Schema.parse(Schema.java:935)
at org.apache.avro.Schema.parse(Schema.java:1042)
at org.apache.avro.Schema.parse(Schema.java:977)
at org.apache.avro.Schema.parse(Schema.java:880)
at com.linkedin.databus2.schemas.FileSystemVersionedSchemaSetProvider.loadSchemas(FileSystemVersionedSchemaSetProvider.java:105)
at com.linkedin.databus2.schemas.FileSystemVersionedSchemaSetProvider.loadSchemas(FileSystemVersionedSchemaSetProvider.java:91)
at com.linkedin.databus2.schemas.FileSystemVersionedSchemaSetProvider.loadSchemas(FileSystemVersionedSchemaSetProvider.java:70)
at com.linkedin.databus2.schemas.FileSystemSchemaRegistryService.refreshSchemaSet(FileSystemSchemaRegistryService.java:156)
at com.linkedin.databus2.schemas.FileSystemSchemaRegistryService.initializeSchemaSet(FileSystemSchemaRegistryService.java:135)
at com.linkedin.databus2.schemas.FileSystemSchemaRegistryService.build(FileSystemSchemaRegistryService.java:57)
at com.linkedin.databus2.schemas.StandardSchemaRegistryFactory.createSchemaRegistry(StandardSchemaRegistryFactory.java:50)
at com.linkedin.databus.container.netty.HttpRelay.<init>(HttpRelay.java:132)
at com.linkedin.databus2.relay.DatabusRelayMain.<init>(DatabusRelayMain.java:100)
at com.linkedin.databus.relay.example.PersonRelayServer.main(PersonRelayServer.java:74)
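For context, Avro has no "binary" primitive; its primitives are null, boolean, int, long, float, double, bytes, and string. A VARBINARY column is therefore normally mapped to "bytes" (or to a fixed type for fixed-width columns). A hedged sketch of such a field declaration — the field name is hypothetical, and the databus-style "meta" attribute is an assumption about your schema layout:

```json
{
  "name": "avatar_blob",
  "type": ["null", "bytes"],
  "default": null,
  "meta": "dbFieldName=AVATAR_BLOB;dbFieldPosition=3;"
}
```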

IllegalArgumentException with mysql-relayer

Hi I am getting this exception

18:58:48.220 [binlog-parser-1] ERROR c.g.c.o.OpenReplicator$ORBinlogParserListener.onException 103 c.g.code.or.common.util.QueryUtil - Exception occured in binlogParser java.lang.IllegalArgumentException: invalid value: 16777215
    at com.google.code.or.common.glossary.column.Int24Column.valueOf(Int24Column.java:70) ~[open-replicator-1.0.8.jar:na]
    at com.google.code.or.binlog.impl.parser.AbstractRowEventParser.parseRow(AbstractRowEventParser.java:152) ~[open-replicator-1.0.8.jar:na]
    at com.google.code.or.binlog.impl.parser.WriteRowsEventV2Parser.parseRows(WriteRowsEventV2Parser.java:75) ~[open-replicator-1.0.8.jar:na]
    at com.google.code.or.binlog.impl.parser.WriteRowsEventV2Parser.parse(WriteRowsEventV2Parser.java:64) ~[open-replicator-1.0.8.jar:na]
    at com.google.code.or.binlog.impl.ReplicationBasedBinlogParser.doParse(ReplicationBasedBinlogParser.java:147) ~[open-replicator-1.0.8.jar:na]
    at com.google.code.or.binlog.impl.AbstractBinlogParser$Task.run(AbstractBinlogParser.java:308) ~[open-replicator-1.0.8.jar:na]
    at java.lang.Thread.run(Thread.java:745) [na:1.8.0_05]
18:58:48.221 [binlog-parser-1] INFO c.g.c.o.OpenReplicator$ORBinlogParserListener.onException 104 c.g.code.or.common.util.QueryUtil - Retrying the prassing from : mysql-bin.000017:794805

Can anyone help me understand what this error is and how to solve it?

Cannot see messages being pushed to kafka via kafka client

I am following the steps to set up the Kafka client: https://github.com/Flipkart/aesop/wiki/Client-Examples
When I run the example, I see this message in my console window:
18:29:59.248 [fk_kafka_cluster-1-RelayPuller] ERROR c.l.databus.client.RelayPullThread.doSourcesResponseSuccess 537 ortest_Person_fk_kafka_cluster-1 - Source not found on server: com.flipkart.aesop.events.ortest.Person
18:29:59.248 [fk_kafka_cluster-0-RelayPuller] ERROR c.l.databus.client.RelayPullThread.doSourcesResponseSuccess 537 ortest_Person_fk_kafka_cluster-0 - Source not found on server: com.flipkart.aesop.events.ortest.Person
I haven't tweaked the sample as such. Any pointers would be much appreciated. The mysql-server-relay is running properly, though.

how to express index in one table

If we have a table like this:
CREATE TABLE IF NOT EXISTS person (
ID varchar(36) NOT NULL ,
Code varchar(50) NOT NULL,
PRIMARY KEY (ID),
KEY index_name (Code)
) ENGINE=InnoDB DEFAULT CHARSET=utf8

we can define an application.conf in the blocking bootstrap server:

MYSQL_CONFIG
{
    or_test:
    {
        "Person"=[
        {
            "destinationNamespace":"or_test",
            "destinationEntity":"PERSONPARALLEL",
            "columnMap": {
                "ID":"ID",
                "Code":"Code"
            },
            "primaryKeyList": [ "pid" ]
        }
        ]
    }
 }

How do we express the "KEY index_name (Code)" part of the table definition?

how to configure mysql datalayer in blocking bootstrap sample

I want to run the bootstrap sample sample-mysql-mysql-blocking-bootstrap, but nothing happens.
It seems the destination MySQL conf is at line 40 of https://github.com/Flipkart/aesop/blob/master/samples/sample-mysql-mysql-blocking-bootstrap/src/main/resources/external/spring-blocking-bootstrap-config.xml:

mysql://or_test%2For_test@localhost:3306/3306/mysql-bin

and

MYSQL_CONFIG
{
    or_test:
    {
        "Person"=[
        {
            "destinationNamespace":"or_test",
            "destinationEntity":"PERSONPARALLEL",
            "columnMap": {
                "id":"pid",
                "firstName":"firstname",
                "lastName":"lastname",
                "birthDate":"birthdate",
                "deleted":"deleted"
            },
            "primaryKeyList": [ "pid" ]
        }
        ]
    }
 }

Does this mean I end up with two tables in database or_test: the original "Person" and the new "PERSONPARALLEL"?
@regunathb

mysql connection uri

I have a very silly question. The connection to MySQL is made using a URI in spring-relay-config.xml. The URI is mysql://or_test%2For_test@localhost:3306/12345/mysql-bin.

Of course this works, but I am confused: where does the 12345 in the URI come from?
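For what it's worth, the URI appears to follow the pattern below. This is a hedged reading, not confirmed by this page: the segment after the port matches what OpenReplicator-style binlog readers take as the replication slave server-id (which must be unique among the master's replicas), and the final segment is the binlog file prefix:

```text
mysql://<user>%2F<password>@<host>:<port>/<slave-server-id>/<binlog-prefix>
        or_test / or_test    localhost 3306 12345            mysql-bin
```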

Issue with example and mvn install

When I try to go through the aesop examples, I get the following error during the mvn command:

[screenshot]

It seems we have an unresolved dependency in the pom.xml file for the linkedin/databus binaries.

Damien

Adding new Avro schema from mysql-Elasticsearch

I have created a new Avro schema for a new database and table I created. My goal is to transfer changes from MySQL to Elasticsearch for this new table.

I have put the schema in the MySQL relay's distribution/schemas_registry. The MySQL relay works fine, but the ES client's spring-client-config.xml is not picking up the new schema.

I get this error:

14:03:49.057 [pz_elastic_search_cluster-1-RelayDispatcher] ERROR c.l.databus.client.GenericDispatcher.doCheckStartSource 527 es_test_bookings_pz_elastic_search_cluster-1 - Unable to find schema: srcid=2001 name=com.flipkart.aesop.events.es_test.bookings schemaId=055f1eedd345aa57fd55e44583260210

My schema is: com.flipkart.aesop.events.es_test.bookings
Can you help me with this? Ask me if you need more information.

Adding a new mapper type isn't possible.

While trying to add a new Mapper that would do mapping based on some conditional logic, we found that because MapperType is an enum, it isn't possible to customize it to add a new mapper.
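The usual fix for a closed enum is to replace it with an interface plus a registry, so callers can plug in new mappers without touching the library's source. A minimal sketch of that design; all names here are hypothetical, not aesop's actual API:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch: an open registry of Mapper implementations instead of a closed
// MapperType enum. Names are hypothetical; not aesop's actual API.
public class MapperRegistry {
    /** A mapper turns a source row (field name -> value) into a destination row. */
    public interface Mapper {
        Map<String, Object> map(Map<String, Object> sourceRow);
    }

    private final Map<String, Mapper> mappers = new HashMap<>();

    public void register(String name, Mapper mapper) {
        mappers.put(name, mapper);
    }

    public Mapper get(String name) {
        Mapper m = mappers.get(name);
        if (m == null) throw new IllegalArgumentException("No mapper registered for: " + name);
        return m;
    }

    public static void main(String[] args) {
        MapperRegistry registry = new MapperRegistry();
        // An identity mapper, like a default pass-through mapping.
        registry.register("identity", row -> new HashMap<>(row));
        // A custom conditional mapper -- impossible to add with a closed enum.
        registry.register("dropDeleted", row ->
                Boolean.TRUE.equals(row.get("deleted")) ? new HashMap<>() : new HashMap<>(row));

        Map<String, Object> row = new HashMap<>();
        row.put("id", 1);
        row.put("deleted", true);
        System.out.println(registry.get("dropDeleted").map(row).isEmpty()); // prints "true"
    }
}
```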

Implement mastership change aware SCN generation logic in MySQL producer

The current implementation of SCN in the MySQL producer will not survive a mastership change automatically: SCN numbers would need to be updated on the producer and all consumers. We need an implementation that models a mastership change as a monotonically increasing generation number, which is then used to generate the SCN. This requires a central coordination service like ZooKeeper.
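The generation idea can be sketched as packing the generation counter into the high bits of the SCN, so that even when the new master's binlog offset restarts from a low value, SCNs keep increasing. The bit widths below are hypothetical, not aesop's actual layout:

```java
// Sketch: an SCN that stays monotonically increasing across mastership
// changes by packing a generation counter (bumped by a coordination
// service such as ZooKeeper on every failover) into the high bits.
// Bit widths are hypothetical, not aesop's actual layout.
public class GenerationScn {
    static final int OFFSET_BITS = 48;
    static final long OFFSET_MASK = (1L << OFFSET_BITS) - 1;

    /** Combine a generation (high bits) with a binlog offset (low bits). */
    static long scn(long generation, long binlogOffset) {
        return (generation << OFFSET_BITS) | (binlogOffset & OFFSET_MASK);
    }

    static long generationOf(long scn) { return scn >>> OFFSET_BITS; }
    static long offsetOf(long scn)     { return scn & OFFSET_MASK; }

    public static void main(String[] args) {
        long beforeFailover = scn(1, 9_000_000L);
        // After failover the new master's binlog offset restarts low,
        // but the bumped generation keeps the SCN monotonic.
        long afterFailover = scn(2, 100L);
        System.out.println(afterFailover > beforeFailover); // prints "true"
    }
}
```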
