spring-projects / spring-integration-extensions

The Spring Integration Extensions project provides extension components for Spring Integration

Home Page: http://www.springintegration.org/


spring-integration-extensions's Introduction

Spring Integration Extensions

The Spring Integration Extensions project provides extension modules for Spring Integration. This project is part of the SpringSource organization on GitHub.

Available Modules

Samples

Under the samples directory, you will find samples for the various modules. Please refer to the documentation of each sample for further details.

Getting support

Check out the spring-integration tag on Stack Overflow.

These extensions are community-supported projects and, unlike Spring Integration itself, they are not released on a regular schedule. If you have specific requests about an extension, open a GitHub issue for consideration. Contributions are always welcome.

Related GitHub projects

Issue Tracking

Report issues via the Spring Integration Extensions JIRA.

Building from source

Each module of the Spring Integration Extensions project is hosted as an independent project with its own release cycle. For the build process of individual modules, we recommend a Gradle-based build system modelled after the Spring Integration project. The Spring Integration Adapter Template for SpringSource Tool Suite (STS) also provides a Gradle-based build system. For more information, please see How to Create New Components.

Therefore, the following build instructions should apply to most, if not all, Spring Integration Extensions. In the instructions below, ./gradlew is invoked from the root of the source tree and serves as a cross-platform, self-contained bootstrap mechanism for the build. The only prerequisites are Git and JDK 1.6+.

Check out the sources

git clone git://github.com/spring-projects/spring-integration-extensions.git

Go into the directory of a specific module

cd module-name

Compile and test, build all jars

./gradlew build

Install the module's jars into your local Maven cache

./gradlew install

... and discover more commands with ./gradlew tasks. See also the Gradle build and release FAQ.

Import sources into your IDE

Using Eclipse / STS

When using SpringSource Tool Suite, you can import Gradle-based projects directly:

File -> Import -> Gradle Project

Just make sure that the Gradle Support for STS is installed. Alternatively, you can generate the Eclipse metadata (.classpath and .project files) using Gradle:

./gradlew eclipse

Once complete, you may then import the projects into Eclipse as usual:

File -> Import -> Existing projects into workspace

Using IntelliJ IDEA

To generate IDEA metadata (.iml and .ipr files), do the following:

./gradlew idea

Contributing

Pull requests are welcome. Please see the contributor guidelines for details. Additionally, if you are contributing, we recommend following the process for Spring Integration as outlined in the administrator guidelines.

Creating Custom Adapters

In order to simplify the process of writing custom components for Spring Integration, we provide a Template project for SpringSource Tool Suite (STS) version 3.0.0 and greater. This template is part of the Spring Integration Templates project. For more information, please read How to Create New Components.

Staying in touch

Follow the Spring Integration team members and contributors on Twitter.

License

The Spring Integration Extensions Framework is released under version 2.0 of the Apache License unless noted otherwise for individual extension modules; exceptions should be rare.

We look forward to your contributions!

spring-integration-extensions's People

Contributors

akryvtsun, al81-ru, amolnayak311, artembilan, beardy247, bellwethr, dependabot[bot], edgedalmacio, erenavsarogullari, garyrussell, ghillert, gregbragg, idueppe, ilayaperumalg, jagedn, joensson, kcrimson, leejianwei, lukasz-antoniak, markfisher, olamy, otnateos, prafsoni, robharrop, sobychacko, spring-builds, spring-operator, tomevers, tysewyn, viniciusccarvalho


spring-integration-extensions's Issues

Can you please try an implementation of a Kryo serializer

We are using Hazelcast as a standalone server and want to keep it unaware of the Java objects on the web server. We had used a JSON envelope for this, but it would be a cool feature if you could provide a Kryo envelope out of the box. Thanks, really great work.

Question - Java-Dsl

Hi,

Can we configure a RequestHandlerRetryAdvice using the Java DSL api?

Thanks

smb MessageHandler not correctly serializing user/password

Hi Folks,

I've just tried to use the spring integration smb extension to push files according to this post.

@Bean
public MessageHandler smbMessageHandler(SmbSessionFactory smbSessionFactory) {
    FileTransferringMessageHandler<SmbFile> handler =
                new FileTransferringMessageHandler<>(smbSessionFactory);
    handler.setRemoteDirectoryExpression(
                new LiteralExpression("remote-target-dir"));
    handler.setFileNameGenerator(m ->
                m.getHeaders().get(FileHeaders.FILENAME, String.class) + ".test");
    handler.setAutoCreateDirectory(true);
    return handler;
}

I do fill in the user/password/domain, and when I launch the code, I get an error:

Caused by: org.springframework.core.NestedIOException: Unable to initialize share: smb://user%3Apassword@hostname:445/test-seb; nested exception is jcifs.smb.SmbAuthException: Logon failure: unknown user name or bad password.

Please note the user%3Apassword; it comes from here:

URLEncoder.encode(domainUserPass, "UTF8"); encodes the :, which is wrong.
The correct URL scheme is smb://user:password@host:445 instead of smb://user%3Apassword@host:445.
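The encoding behavior can be reproduced standalone (class and method names below are illustrative, not the adapter's actual code):

```java
import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;

// Reproduces the reported bug: URLEncoder targets
// application/x-www-form-urlencoded data, so it percent-encodes ':'
// (as %3A), which corrupts the user-info portion of an SMB URL.
class SmbUrlEncodingDemo {
    static String encodeUserPass(String userPass) {
        try {
            return URLEncoder.encode(userPass, "UTF8"); // same call as in the report
        } catch (UnsupportedEncodingException e) {
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) {
        // prints user%3Apassword, not the expected user:password
        System.out.println(encodeUserPass("user:password"));
    }
}
```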

Question on SplunkEvent

We are trying to use this extension (via Spring XD) and found that we always need to send the data as key/value pairs. In our use case we need to send JSON data to Splunk, and we cannot use plain key/value pairs. Is there any plan to create an alternative SplunkEvent that does not require a key (i.e., without the addPair functionality)?

Kafka - incorrect handling of null messages?

I'm using the following configuration to receive messages on the kafkaChannel bean. Unfortunately, when I call kafkaChannel.receive(), I receive Message objects even if the topic is new, and nothing has been published. (The payload returned is a HashMap)

<int:channel id="kafkaChannel">
    <int:queue />
</int:channel>

<int-kafka:inbound-channel-adapter id="kafkaInboundChannelAdapter"
                                   kafka-consumer-context-ref="consumerContext"
                                   auto-startup="true"
                                   channel="kafkaChannel">
    <int:poller fixed-delay="10" time-unit="MILLISECONDS" max-messages-per-poll="5"/>
</int-kafka:inbound-channel-adapter>

<int-kafka:zookeeper-connect id="zookeeperConnect" zk-connect="localhost:9092" />

<int-kafka:consumer-context id="consumerContext" zookeeper-connect="zookeeperConnect">
    <int-kafka:consumer-configurations>
        <int-kafka:consumer-configuration group-id="default">
            <int-kafka:topic id="Test_Topic" streams="6"/>
        </int-kafka:consumer-configuration>
    </int-kafka:consumer-configurations>
</int-kafka:consumer-context>

Have I done something bone-headed with the configuration above? The documentation isn't terribly clear.

I believe there is a bug in the receive method of KafkaConsumerContext. It should return a Message only when data has actually been consumed, and null otherwise:

return consumedData.isEmpty() ? null : MessageBuilder.withPayload(consumedData).build();
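The proposed guard can be illustrated standalone (Message/MessageBuilder are modeled here with a plain Map; the real fix would live in KafkaConsumerContext):

```java
import java.util.HashMap;
import java.util.Map;

// An empty poll yields null instead of a message wrapping an empty HashMap,
// matching the semantics callers of receive() expect from a new, empty topic.
class EmptyPollGuard {
    static Map<String, Object> toMessagePayload(Map<String, Object> consumedData) {
        return consumedData.isEmpty() ? null : consumedData;
    }

    public static void main(String[] args) {
        System.out.println(toMessagePayload(new HashMap<>())); // nothing consumed -> null
        Map<String, Object> data = new HashMap<>();
        data.put("Test_Topic", "value");
        System.out.println(toMessagePayload(data) != null);    // data consumed -> message
    }
}
```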

CacheListeningPolicyType.ALL not working when using ITopic

Messages published to an ITopic are received by the local Hazelcast member only. All remote members receive the message but discard it, even when CacheListeningPolicyType.ALL is set by the HazelcastEventDrivenMessageProducer.

HazelcastMessageListener is ignoring the currently set CacheListeningPolicyType and just calls sendMessage with null here:

Any reason why this shouldn't pass the CacheListeningPolicyType?

spring-integration-hazelcast requires JVM local Hazelcast instance

Hi,

I tried to use the leader-election functionality for Hazelcast and stumbled upon the fact that org.springframework.integration.hazelcast.HazelcastLocalInstanceRegistrar checks com.hazelcast.core.Hazelcast for all HazelcastInstances on that JVM. The thing is that this doesn't include HazelcastClients, but in our scenario we have a dedicated Hazelcast server to which we connect with HazelcastClients.

I lack deeper knowledge of the spring-integration project, as I'm just after the leader election, but wouldn't it be sufficient to have a HazelcastInstance autowired, which is then used to register the listener on?

For the moment I worked around the issue by tricking Spring into thinking that this registrar is already present, by adding a dummy bean. This seems to do no harm for the leader-election scenario, but likely breaks other features.

    @Bean(name = "org.springframework.integration.hazelcast.HazelcastLocalInstanceRegistrar")
    public String hazelcastLocalInstanceRegistrar() {
        return "";
    }

spring-integration-hazelcast LeaderInitiator shuts down if an error is caught during event propagation

Hi,

I have a scenario where an exception is thrown during LeaderEvent propagation. It makes its way into the org.springframework.integration.hazelcast.leader.LeaderInitiator class, where it is not caught; worse, the error is swallowed and does not even cause the app to die. This leaves you with a broken instance and no details about why it failed.

I'm not sure whether I'm doing something wrong in my listener or whether the LeaderInitiator should guard the LeaderEvent propagation more defensively.

Is there a spring-integration-smb release version?

Have you released the Spring Integration SMB module, or is there any plan to release it in the near future? With the snapshot version we are not able to download it, and we need a released version because we are moving our code to a production environment.

Java DSL: Assertion error "headerExpressions must not be empty"

If we don't add a headerExpression on the EnricherSpec class (because we simply want to enrich only the payload), we get an assertion error from the ContentEnricher class saying

headerExpressions must not be empty

The same happens when we only want to use header expressions.
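A simplified model of the reported assertion (not the real ContentEnricher code): requiring headerExpressions alone to be non-empty rejects legitimate payload-only enrichment, while requiring at least one of the two expression maps to be non-empty accepts both cases.

```java
import java.util.Map;

// Both checks receive the configured property (payload) expressions and
// header expressions; only the suggested check tolerates a payload-only setup.
class EnricherAssertionSketch {
    static boolean buggyCheck(Map<String, String> propertyExpressions,
                              Map<String, String> headerExpressions) {
        return !headerExpressions.isEmpty(); // fails for payload-only enrichment
    }

    static boolean suggestedCheck(Map<String, String> propertyExpressions,
                                  Map<String, String> headerExpressions) {
        return !propertyExpressions.isEmpty() || !headerExpressions.isEmpty();
    }
}
```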

Issues with leader election if Hazelcast cluster nodes failed

Our application connects to a separate Hazelcast cluster as a client. We experienced outages of some of the Hazelcast nodes, leading to no leader being elected.

Scenario:

  • Our Client Nodes( our service) connects to hazelcast cluster (server)
  • Cluster has one leader, state of Leader Election is fine
  • Some/all server nodes fail
  • After service recovery no instance is Leader, Hazelcast is available (unlocked)

I've investigated the LeaderInitiator and found a couple of issues with the current implementation.

  1. The node elected leader never again checks whether it is still the leader, due to
Thread.sleep( Long.MAX_VALUE );

after the client node became leader. This can be problematic if the cluster lost information about the lock, or if someone force-unlocked it; that likely results in the lock being re-granted to another node (yielding two leaders). It might be better to regularly check whether the lock is still held by the node and, if not, fire the onRevoked event and try to lock again.

  2. Client nodes not elected leader exit the tryLock cycle if the server node that received the lock request dies. This effectively blocks the client node from ever becoming leader again until restart. The exception thrown in such a disconnect event is:
ClientInvocation{clientMessageType=1800, target=partition 258, sendConnection=ClientConnection{alive=false, connectionId=3, socketChannel=DefaultSocketChannelWrapper{socketChannel=java.nio.channels.SocketChannel[closed]}, remoteEndpoint=[10.100.72.10]:5701, lastReadTime=2018-01-22 14:50:08.804, lastWriteTime=2018-01-22 14:50:08.811, closedTime=2018-01-22 14:50:08.804, lastHeartbeatRequested=2018-01-22 14:06:36.623, lastHeartbeatReceived=2018-01-22 14:06:36.625, connected server version=3.8.6}} timed out by 2486800 ms",
      "message": "ClientInvocation{clientMessageType=1800, target=partition 258, sendConnection=ClientConnection{alive=false, connectionId=3, socketChannel=DefaultSocketChannelWrapper{socketChannel=java.nio.channels.SocketChannel[closed]}, remoteEndpoint=[10.100.72.10]:5701, lastReadTime=2018-01-22 14:50:08.804, lastWriteTime=2018-01-22 14:50:08.811, closedTime=2018-01-22 14:50:08.804, lastHeartbeatRequested=2018-01-22 14:06:36.623, lastHeartbeatReceived=2018-01-22 14:06:36.625, connected server version=3.8.6}} timed out by 2486800 ms",
      "name": "com.hazelcast.core.OperationTimeoutException",
      "cause": {
        "commonElementCount": 0,
        "localizedMessage": "Connection closed by the other side",
        "message": "Connection closed by the other side",
        "name": "com.hazelcast.spi.exception.TargetDisconnectedException",
        "cause": {
          "commonElementCount": 0,
          "localizedMessage": "Remote socket closed!",
          "message": "Remote socket closed!",
          "name": "java.io.EOFException",
          "extendedStackTrace": [
            {
              "class": "com.hazelcast.internal.networking.nonblocking.NonBlockingSocketReader",
              "method": "handle",
              "file": "NonBlockingSocketReader.java",
              "line": 153,
              "exact": false,
              "location": "hazelcast-3.8.6.jar!/",
              "version": "3.8.6"
            },
            {
              "class": "com.hazelcast.internal.networking.nonblocking.NonBlockingIOThread",
              "method": "handleSelectionKey",
              "file": "NonBlockingIOThread.java",
              "line": 349,
              "exact": false,
              "location": "hazelcast-3.8.6.jar!/",
              "version": "3.8.6"
            },
            {
              "class": "com.hazelcast.internal.networking.nonblocking.NonBlockingIOThread",
              "method": "handleSelectionKeys",
              "file": "NonBlockingIOThread.java",
              "line": 334,
              "exact": false,
              "location": "hazelcast-3.8.6.jar!/",
              "version": "3.8.6"
            },
            {
              "class": "com.hazelcast.internal.networking.nonblocking.NonBlockingIOThread",
              "method": "selectLoop",
              "file": "NonBlockingIOThread.java",
              "line": 252,
              "exact": false,
              "location": "hazelcast-3.8.6.jar!/",
              "version": "3.8.6"
            },
            {
              "class": "com.hazelcast.internal.networking.nonblocking.NonBlockingIOThread",
              "method": "run",
              "file": "NonBlockingIOThread.java",
              "line": 205,
              "exact": false,
              "location": "hazelcast-3.8.6.jar!/",
              "version": "3.8.6"
            }
          ]
        },
        "extendedStackTrace": [
          {
            "class": "com.hazelcast.client.spi.impl.ClientInvocationServiceSupport$CleanResourcesTask",
            "method": "notifyException",
            "file": "ClientInvocationServiceSupport.java",
            "line": 229,
            "exact": false,
            "location": "hazelcast-client-3.8.6.jar!/",
            "version": "3.8.6"
          },
          {
            "class": "com.hazelcast.client.spi.impl.ClientInvocationServiceSupport$CleanResourcesTask",
            "method": "run",
            "file": "ClientInvocationServiceSupport.java",
            "line": 214,
            "exact": false,
            "location": "hazelcast-client-3.8.6.jar!/",
            "version": "3.8.6"
          },
          {
            "class": "java.util.concurrent.Executors$RunnableAdapter",
            "method": "call",
            "file": "Executors.java",
            "line": 511,
            "exact": false,
            "location": "?",
            "version": "1.8.0_112"
          },
          {
            "class": "java.util.concurrent.FutureTask",
            "method": "runAndReset",
            "file": "FutureTask.java",
            "line": 308,
            "exact": false,
            "location": "?",
            "version": "1.8.0_112"
          },
          {
            "class": "java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask",
            "method": "access$301",
            "file": "ScheduledThreadPoolExecutor.java",
            "line": 180,
            "exact": false,
            "location": "?",
            "version": "1.8.0_112"
          },
          {
            "class": "java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask",
            "method": "run",
            "file": "ScheduledThreadPoolExecutor.java",
            "line": 294,
            "exact": false,
            "location": "?",
            "version": "1.8.0_112"
          },
          {
            "class": "com.hazelcast.util.executor.LoggingScheduledExecutor$LoggingDelegatingFuture",
            "method": "run",
            "file": "LoggingScheduledExecutor.java",
            "line": 140,
            "exact": false,
            "location": "hazelcast-3.8.6.jar!/",
            "version": "3.8.6"
          },
          {
            "class": "java.util.concurrent.ThreadPoolExecutor",
            "method": "runWorker",
            "file": "ThreadPoolExecutor.java",
            "line": 1142,
            "exact": false,
            "location": "?",
            "version": "1.8.0_112"
          },
          {
            "class": "java.util.concurrent.ThreadPoolExecutor$Worker",
            "method": "run",
            "file": "ThreadPoolExecutor.java",
            "line": 617,
            "exact": false,
            "location": "?",
            "version": "1.8.0_112"
          },
          {
            "class": "java.lang.Thread",
            "method": "run",
            "file": "Thread.java",
            "line": 745,
            "exact": false,
            "location": "?",
            "version": "1.8.0_112"
          },
          {
            "class": "com.hazelcast.util.executor.HazelcastManagedThread",
            "method": "executeRun",
            "file": "HazelcastManagedThread.java",
            "line": 64,
            "exact": false,
            "location": "hazelcast-3.8.6.jar!/",
            "version": "3.8.6"
          },
          {
            "class": "com.hazelcast.util.executor.HazelcastManagedThread",
            "method": "run",
            "file": "HazelcastManagedThread.java",
            "line": 80,
            "exact": false,
            "location": "hazelcast-3.8.6.jar!/",
            "version": "3.8.6"
          }
        ]
      },
      "extendedStackTrace": [
        {
          "class": "com.hazelcast.client.spi.impl.ClientInvocation",
          "method": "notifyException",
          "file": "ClientInvocation.java",
          "line": 203,
          "exact": false,
          "location": "hazelcast-client-3.8.6.jar!/",
          "version": "3.8.6"
        },
        {
          "class": "com.hazelcast.client.spi.impl.ClientInvocationServiceSupport$CleanResourcesTask",
          "method": "notifyException",
          "file": "ClientInvocationServiceSupport.java",
          "line": 234,
          "exact": false,
          "location": "hazelcast-client-3.8.6.jar!/",
          "version": "3.8.6"
        },
        {
          "class": "com.hazelcast.client.spi.impl.ClientInvocationServiceSupport$CleanResourcesTask",
          "method": "run",
          "file": "ClientInvocationServiceSupport.java",
          "line": 214,
          "exact": false,
          "location": "hazelcast-client-3.8.6.jar!/",
          "version": "3.8.6"
        },
        {
          "class": "java.util.concurrent.Executors$RunnableAdapter",
          "method": "call",
          "file": "Executors.java",
          "line": 511,
          "exact": false,
          "location": "?",
          "version": "1.8.0_112"
        },
        {
          "class": "java.util.concurrent.FutureTask",
          "method": "runAndReset",
          "file": "FutureTask.java",
          "line": 308,
          "exact": false,
          "location": "?",
          "version": "1.8.0_112"
        },
        {
          "class": "java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask",
          "method": "access$301",
          "file": "ScheduledThreadPoolExecutor.java",
          "line": 180,
          "exact": false,
          "location": "?",
          "version": "1.8.0_112"
        },
        {
          "class": "java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask",
          "method": "run",
          "file": "ScheduledThreadPoolExecutor.java",
          "line": 294,
          "exact": false,
          "location": "?",
          "version": "1.8.0_112"
        },
        {
          "class": "com.hazelcast.util.executor.LoggingScheduledExecutor$LoggingDelegatingFuture",
          "method": "run",
          "file": "LoggingScheduledExecutor.java",
          "line": 140,
          "exact": false,
          "location": "hazelcast-3.8.6.jar!/",
          "version": "3.8.6"
        },
        {
          "class": "java.util.concurrent.ThreadPoolExecutor",
          "method": "runWorker",
          "file": "ThreadPoolExecutor.java",
          "line": 1142,
          "exact": false,
          "location": "?",
          "version": "1.8.0_112"
        },
        {
          "class": "java.util.concurrent.ThreadPoolExecutor$Worker",
          "method": "run",
          "file": "ThreadPoolExecutor.java",
          "line": 617,
          "exact": false,
          "location": "?",
          "version": "1.8.0_112"
        },
        {
          "class": "java.lang.Thread",
          "method": "run",
          "file": "Thread.java",
          "line": 745,
          "exact": false,
          "location": "?",
          "version": "1.8.0_112"
        },
        {
          "class": "com.hazelcast.util.executor.HazelcastManagedThread",
          "method": "executeRun",
          "file": "HazelcastManagedThread.java",
          "line": 64,
          "exact": false,
          "location": "hazelcast-3.8.6.jar!/",
          "version": "3.8.6"
        },
        {
          "class": "com.hazelcast.util.executor.HazelcastManagedThread",
          "method": "run",
          "file": "HazelcastManagedThread.java",
          "line": 80,
          "exact": false,
          "location": "hazelcast-3.8.6.jar!/",
          "version": "3.8.6"
        },
        {
          "class": "------ submitted from ------",
          "line": -1,
          "exact": false,
          "location": "?",
          "version": "?"
        },
        {
          "class": "com.hazelcast.client.spi.impl.ClientInvocationFuture",
          "method": "resolveAndThrowIfException",
          "file": "ClientInvocationFuture.java",
          "line": 95,
          "exact": false,
          "location": "hazelcast-client-3.8.6.jar!/",
          "version": "3.8.6"
        },
        {
          "class": "com.hazelcast.client.spi.impl.ClientInvocationFuture",
          "method": "resolveAndThrowIfException",
          "file": "ClientInvocationFuture.java",
          "line": 32,
          "exact": false,
          "location": "hazelcast-client-3.8.6.jar!/",
          "version": "3.8.6"
        },
        {
          "class": "com.hazelcast.spi.impl.AbstractInvocationFuture",
          "method": "get",
          "file": "AbstractInvocationFuture.java",
          "line": 155,
          "exact": false,
          "location": "hazelcast-3.8.6.jar!/",
          "version": "3.8.6"
        },
        {
          "class": "com.hazelcast.client.spi.ClientProxy",
          "method": "invokeOnPartition",
          "file": "ClientProxy.java",
          "line": 170,
          "exact": false,
          "location": "hazelcast-client-3.8.6.jar!/",
          "version": "3.8.6"
        },
        {
          "class": "com.hazelcast.client.proxy.PartitionSpecificClientProxy",
          "method": "invokeOnPartition",
          "file": "PartitionSpecificClientProxy.java",
          "line": 47,
          "exact": false,
          "location": "hazelcast-client-3.8.6.jar!/",
          "version": "3.8.6"
        },
        {
          "class": "com.hazelcast.client.proxy.ClientLockProxy",
          "method": "tryLock",
          "file": "ClientLockProxy.java",
          "line": 145,
          "exact": false,
          "location": "hazelcast-client-3.8.6.jar!/",
          "version": "3.8.6"
        },
        {
          "class": "com.hazelcast.client.proxy.ClientLockProxy",
          "method": "tryLock",
          "file": "ClientLockProxy.java",
          "line": 135,
          "exact": false,
          "location": "hazelcast-client-3.8.6.jar!/",
          "version": "3.8.6"
        },
        {
          "class": "customized.LeaderInitiator$LeaderSelector",
          "method": "call",
          "file": "LeaderInitiator.java",
          "line": 251,
          "exact": true,
          "version": "?"
        }

spring-integration-mqtt paho lib mismatch?

Hi, thanks for the MQTT adapters. I downloaded the latest Paho jar (org.eclipse.paho.client.mqttv3.jar) and am getting build errors, for example an unresolved import org.eclipse.paho.client.mqttv3.IMqttDeliveryToken in MqttPahoMessageHandler. Am I using the wrong library?

Thanks,
John

Java DSL: Issue using recipientListRoute with defaultOutputChannelName

When using a recipientListRoute with the defaultOutputChannel set by name, the default output channel is not resolved because AbstractMessageRouter#onInit is not called.

The bug is in the RecipientListRouter class, which does not call AbstractMessageRouter#onInit.

spring-integration-aws: spring-integration-core version is not updated

In the build.gradle file, the Spring Integration version is

springIntegrationVersion = '4.0.3.RELEASE'

But when including it from Maven Central, the published version still depends on an older version of Spring Integration:

<groupId>org.springframework.integration</groupId>
<artifactId>spring-integration-aws</artifactId>
<version>0.5.0.RELEASE</version>
<name>Spring Integration AWS Support</name>

    <dependency>
      <groupId>org.springframework.integration</groupId>
      <artifactId>spring-integration-mail</artifactId>
      <version>2.2.2.RELEASE</version>
      <scope>compile</scope>
      <optional>true</optional>
    </dependency>
    <dependency>
      <groupId>org.springframework.integration</groupId>
      <artifactId>spring-integration-core</artifactId>
      <version>2.2.2.RELEASE</version>
      <scope>compile</scope>
    </dependency>

However, the latest commit message says that Spring Integration was updated to 4.0.3.RELEASE.

Because of this I'm also getting the following error:
Caused by: java.lang.ClassNotFoundException: org.springframework.integration.Message

I'm guessing the version published to Maven Central is an older build. In that case, could you kindly republish it?

Thanks for the wonderful work.
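Until a corrected artifact is published, a common Maven workaround (sketched here, untested against this particular artifact) is to exclude the stale transitive spring-integration-core and pin the version the code actually needs:

```xml
<dependency>
  <groupId>org.springframework.integration</groupId>
  <artifactId>spring-integration-aws</artifactId>
  <version>0.5.0.RELEASE</version>
  <exclusions>
    <exclusion>
      <groupId>org.springframework.integration</groupId>
      <artifactId>spring-integration-core</artifactId>
    </exclusion>
  </exclusions>
</dependency>
<dependency>
  <groupId>org.springframework.integration</groupId>
  <artifactId>spring-integration-core</artifactId>
  <version>4.0.3.RELEASE</version>
</dependency>
```

This only papers over the mismatch in the published POM; it does not fix an incorrectly built jar.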

Enhancement - Ability to set the SMB min/max versions

This enhancement would leverage a feature in the jCIFS library to set the minimum/maximum SMB versions via configuration. The current implementation defaults to the jCIFS properties jcifs.smb.client.minVersion="SMB1" and jcifs.smb.client.maxVersion="SMB210".
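A minimal sketch of the requested configuration knobs. The property keys are the jCIFS client properties named above; wiring the resulting Properties into a jCIFS context (e.g. via its PropertyConfiguration) is assumed and not shown.

```java
import java.util.Properties;

// Builds the jCIFS client properties controlling the SMB protocol range.
class SmbVersionConfig {
    static Properties smbVersionProperties(String minVersion, String maxVersion) {
        Properties props = new Properties();
        props.setProperty("jcifs.smb.client.minVersion", minVersion);
        props.setProperty("jcifs.smb.client.maxVersion", maxVersion);
        return props;
    }

    public static void main(String[] args) {
        // Current defaults mentioned in the enhancement request
        System.out.println(smbVersionProperties("SMB1", "SMB210"));
    }
}
```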

SMPP: Autoreconnect doesn't work

To test whether reconnection to the server works, I kill my SMPP server and restart it. The client never manages to restore the connection.

See the log:
17:52:57 INFO o.j.s.AbstractSession - executeSendCommand
17:52:57 INFO o.j.s.AbstractSession - commandID:80000015 status:00000000 sequence:16
17:53:02 INFO o.j.s.AbstractSession - EnquireLinkSender.run() send
17:53:02 INFO o.j.s.AbstractSession - executeSendCommand
17:53:02 INFO o.j.s.AbstractSession - commandID:80000015 status:00000000 sequence:17
17:53:07 INFO o.j.s.AbstractSession - EnquireLinkSender.run() send
17:53:07 INFO o.j.s.AbstractSession - executeSendCommand
17:53:07 INFO o.j.s.AbstractSession - commandID:80000015 status:00000000 sequence:18
17:53:09 WARN o.j.s.SMPPSession - IOException while reading: null
17:53:09 WARN o.j.s.AbstractSession - closing enquireLinkSender Thread[EnquireLinkSender: org.jsmpp.session.SMPPSession@7b3dd80d,5,main]
17:53:09 WARN o.j.s.AbstractSession - closing enquireLinkSender alive:true daemon:true interrupted:false
17:53:09 INFO o.j.s.AbstractSession - JOINED!!!
17:53:09 WARN o.j.s.AbstractSession - closing enquireLinkSender Thread[EnquireLinkSender: org.jsmpp.session.SMPPSession@7b3dd80d,5,]
17:53:09 WARN o.j.s.AbstractSession - closing enquireLinkSender alive:false daemon:true interrupted:false
17:53:09 INFO o.j.s.SMPPSession - PDUReaderWorker stop
17:53:16 ERROR o.s.i.s.s.SmppSessionFactoryBean$AutoReconnectLifecycle - Error happened when trying to connect to localhost:20775. Cause: Failed connecting
17:53:23 ERROR o.s.i.s.s.SmppSessionFactoryBean$AutoReconnectLifecycle - Error happened when trying to connect to localhost:20775. Cause: Failed connecting
17:53:46 ERROR o.s.i.s.s.SmppSessionFactoryBean$AutoReconnectLifecycle - Error happened when trying to connect to localhost:20775. Cause: Failed connecting
17:53:53 ERROR o.s.i.s.s.SmppSessionFactoryBean$AutoReconnectLifecycle - Error happened when trying to connect to localhost:20775. Cause: Failed connecting
17:54:16 ERROR o.s.i.s.s.SmppSessionFactoryBean$AutoReconnectLifecycle - Error happened when trying to connect to localhost:20775. Cause: Failed connecting
17:54:23 ERROR o.s.i.s.s.SmppSessionFactoryBean$AutoReconnectLifecycle - Error happened when trying to connect to localhost:20775. Cause: Failed connecting
17:54:46 ERROR o.s.i.s.s.SmppSessionFactoryBean$AutoReconnectLifecycle - Error happened when trying to connect to localhost:20775. Cause: Failed connecting
17:54:53 ERROR o.s.i.s.s.SmppSessionFactoryBean$AutoReconnectLifecycle - Error happened when trying to connect to localhost:20775. Cause: Failed connecting
17:55:16 ERROR o.s.i.s.s.SmppSessionFactoryBean$AutoReconnectLifecycle - Error happened when trying to connect to localhost:20775. Cause: Failed connecting
17:55:23 ERROR o.s.i.s.s.SmppSessionFactoryBean$AutoReconnectLifecycle - Error happened when trying to connect to localhost:20775. Cause: Failed connecting
17:55:46 ERROR o.s.i.s.s.SmppSessionFactoryBean$AutoReconnectLifecycle - Error happened when trying to connect to localhost:20775. Cause: Failed connecting
17:55:53 ERROR o.s.i.s.s.SmppSessionFactoryBean$AutoReconnectLifecycle - Error happened when trying to connect to localhost:20775. Cause: Failed connecting
17:56:16 ERROR o.s.i.s.s.SmppSessionFactoryBean$AutoReconnectLifecycle - Error happened when trying to connect to localhost:20775. Cause: Failed connecting
17:56:23 ERROR o.s.i.s.s.SmppSessionFactoryBean$AutoReconnectLifecycle - Error happened when trying to connect to localhost:20775. Cause: Failed connecting

Maven coordinates

Could you please point me to the Maven repository hosting these artifacts?
Thanks

[splunk] Update to latest splunk.jar (1.2)

v1.0 has an important bug where one can't log in (!).

The problem is in the Service class:

    @Override public ResponseMessage send(String path, RequestMessage request) {
        request.getHeader().put("Authorization", token);
        return super.send(fullpath(path), request);
    }

vs.

    @Override public ResponseMessage send(String path, RequestMessage request) {
        if (token != null) {
            request.getHeader().put("Authorization", token);
        }
        return super.send(fullpath(path), request);
    }

I would issue a PR, but apparently splunk.jar is managed by hand here, so I can't do much.

zip extension with DSL

I'm trying to use the milestone build of the zip integration and am not having any success with the unzip result splitter flow I've created using one of the tests in the repo.

Here is my interpretation of the configuration:

    @Bean
    public IntegrationFlow fileWritingFlow() {
        return IntegrationFlows.from(unzipInboundResultChannel()).wireTap(f -> f.handle(logger()))
                .handle(Files.outboundAdapter("'${workDir}/' + headers['zip_entryPath']").autoCreateDirectory(true))
                .get();
    }

    public UnZipTransformer unzipTransformer() {
        UnZipTransformer unzipTransformer = new UnZipTransformer();
        unzipTransformer.setZipResultType(ZipResultType.BYTE_ARRAY);
        return unzipTransformer;
    }

    public UnZipResultSplitter unZipResultSplitter() {
        return new UnZipResultSplitter();
    }

    @Bean
    public MessageChannel zipInboundResultChannel() {
        return MessageChannels.queue("zipInboundResultChannel").get();
    }

    @Bean
    public MessageChannel unzipInboundResultChannel() {
        return MessageChannels.queue("unzipInboundResultChannel").get();
    }

    @Bean
    public IntegrationFlow unzipFlow() {
        return IntegrationFlows.from(zipInboundResultChannel())
                .transform(unzipTransformer())
                .split(unZipResultSplitter(), "splitUnzippedMap")
                .channel(unzipInboundResultChannel())
                .get();
    }

The expression using workDir is not interpreted correctly (it results in a directory literally named '${workDir}'), and the files are not written out to the directory even when I use new File("/tmp") instead; in that case, I just get the zip file itself. Is there anything I'm configuring incorrectly?
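A possible workaround, sketched here as an assumption only: `${...}` property placeholders are not resolved inside strings built programmatically, so the '${workDir}' part of the SpEL expression is passed through literally. Injecting the property with @Value and concatenating it into the expression avoids that (this assumes the String overload of Files.outboundAdapter is evaluated as a SpEL directory expression):

```java
// Sketch (untested): resolve the workDir property via @Value instead of
// embedding an unresolved ${workDir} placeholder in the SpEL string.
@Value("${workDir}")
private String workDir;

@Bean
public IntegrationFlow fileWritingFlow() {
    return IntegrationFlows.from(unzipInboundResultChannel())
            .wireTap(f -> f.handle(logger()))
            // the directory expression now contains the already-resolved path
            .handle(Files.outboundAdapter("'" + workDir + "/' + headers['zip_entryPath']")
                    .autoCreateDirectory(true))
            .get();
}
```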

spring-integration-splunk configuration

Guys,

I'm using:

        <groupId>org.springframework.integration</groupId>
        <artifactId>spring-integration-splunk</artifactId>
        <version>1.0.0.M1</version>

And getting the exception:

org.springframework.integration.MessagingException: search Splunk data failed
    at org.springframework.integration.splunk.support.SplunkExecutor.poll(SplunkExecutor.java:74)
    at org.springframework.integration.splunk.inbound.SplunkPollingChannelAdapter.receive(SplunkPollingChannelAdapter.java:66)
    at com.rbi.bdd.steps.SplunkVZTIntTestingStep.thenTheVZTServiceTransactionLogsShouldPresentOnSplunk(SplunkVZTIntTestingStep.java:45)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.jbehave.core.steps.StepCreator$ParameterisedStep.perform(StepCreator.java:550)
    at org.jbehave.core.embedder.StoryRunner$FineSoFar.run(StoryRunner.java:499)
    at org.jbehave.core.embedder.StoryRunner.runStepsWhileKeepingState(StoryRunner.java:479)
    at org.jbehave.core.embedder.StoryRunner.runScenarioSteps(StoryRunner.java:443)
    at org.jbehave.core.embedder.StoryRunner.runCancellable(StoryRunner.java:305)
    at org.jbehave.core.embedder.StoryRunner.run(StoryRunner.java:219)
    at org.jbehave.core.embedder.StoryRunner.run(StoryRunner.java:180)
    at org.jbehave.core.embedder.StoryManager$EnqueuedStory.call(StoryManager.java:229)
    at org.jbehave.core.embedder.StoryManager$EnqueuedStory.call(StoryManager.java:201)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:744)
Caused by: java.lang.RuntimeException: could not connect to Splunk Server @ RBHQLKONDRATE-D:8089 - com.splunk.HttpException: HTTP 400
    at org.springframework.integration.splunk.support.SplunkServiceFactory.getService(SplunkServiceFactory.java:81)
    at org.springframework.integration.splunk.support.SplunkDataReader.runQuery(SplunkDataReader.java:288)
    at org.springframework.integration.splunk.support.SplunkDataReader.nonBlockingSearch(SplunkDataReader.java:317)
    at org.springframework.integration.splunk.support.SplunkDataReader.read(SplunkDataReader.java:188)
    at org.springframework.integration.splunk.support.SplunkExecutor.poll(SplunkExecutor.java:70)
    ... 19 more
All the properties are set correctly, and if I debug down into the spring-integration-splunk code I can see that it fails because it tries to get a Service from SplunkServiceFactory, but the service is null. As far as I can tell, the service should be set by the framework itself, so I can't do anything about it.

Please, help me to understand how to resolve the issue.

Thanks,
Laura.

AmqpOutboundEndpointSpec missing method

Hello,

I'm trying to migrate

<int-amqp:outbound-channel-adapter channel="newProposalOutChannel"
                                   amqp-template="amqpTemplate"
                                   exchange-name="ib.underwriting.proposal.new"/>

with the DSL, but I don't see how to specify the channel:

@Bean
public AmqpOutboundEndpoint newProposalOutChannelAdapter() {
    return Amqp.outboundAdapter(amqpTemplate)
            .exchangeName("ib.underwriting.proposal.new")
            .get();
}

Did I miss something? The other XML attributes have counterparts, but channel does not.
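For comparison, in the Java DSL the channel is normally supplied by the flow the adapter sits in, rather than by the adapter spec itself; something along these lines (a sketch, untested, reusing the names from the XML above):

```java
// Sketch: start the flow from the channel name and hand the message
// to the AMQP outbound adapter, so no channel setter is needed.
@Bean
public IntegrationFlow newProposalFlow() {
    return IntegrationFlows.from("newProposalOutChannel")
            .handle(Amqp.outboundAdapter(amqpTemplate)
                    .exchangeName("ib.underwriting.proposal.new"))
            .get();
}
```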

Thank you!

Question - Java-DSL Transaction

Hi,

I am trying to set a transaction on a poller and I don't know how to do it. The API doesn't seem to allow configuring a Poller when the channel is given as a String:

// Not able to do that
IntegrationFlows.from("myChannel", c -> c.poller(Pollers....))

Thanks for your input
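One way this is sometimes handled, sketched here under the assumption that "myChannel" is a pollable (queue) channel: put the poller, including its transaction setting, on the consuming endpoint via the endpoint-configurer overload of handle (MyService and its process method are hypothetical placeholders):

```java
// Sketch: configure a transactional poller on the consuming endpoint
// instead of on the String-based channel reference.
@Bean
public IntegrationFlow myFlow(MyService service, PlatformTransactionManager txManager) {
    return IntegrationFlows.from("myChannel")
            .handle(service, "process",
                    e -> e.poller(Pollers.fixedDelay(1000).transactional(txManager)))
            .get();
}
```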

Inbound poller breaks when file gets modified while reading

We have a use case where we read files from a remote Windows share, with the following steps:

  1. Read the file from the Windows share location, say an input folder.
  2. The files are usually very large (> 100 MB), so the poller takes time to download them.
  3. If the user updates/modifies the source file (copies and pastes the same file) while the poller is downloading it, the poller breaks and moves to the next step in the flow with a half-downloaded file.

Please see this link for more details http://stackoverflow.com/questions/43389062/spring-integration-inbound-poller-breaks-when-file-gets-modified-while-reading?noredirect=1#comment74369732_43389062

Splunk: Real-Time search not working

I've been having trouble getting real-time searches to work: the connector launches correctly but then doesn't output any results. I think I've chased the problem down to the runQuery method in SplunkDataReader: it launches a search and then waits for the job to end before pulling in events. Since a real-time search never ends, the connector sits and blocks while waiting.

I've tested this by launching a search and then finalizing the job from within the Splunk jobs interface. As soon as the job is finalized, all data is correctly delivered into the integration pipeline.

Java DSL and ContentEnricher

Hi,

I created a class that extends ContentEnricher and I would like to use it in my flow, but I am only able to use it with the handle method, not with the enrich method.

Works with

return IntegrationFlows.from("myChannel")
                .handle(myCustomEnricher)

Doesn't work with

return IntegrationFlows.from("myChannel")
                .enrich(myCustomEnricher)

Kafka: regex validation exception when using configuration property setting for topic in int-kafka:topic-filter

If one uses a Spring property placeholder such as pattern="${topic}" (see below), the framework validates the raw placeholder as a regular expression instead of substituting the property value first.

Spring xml

<int-kafka:consumer-context id="consumerContext"
        consumer-timeout="${kafka.consumer.timeout}"
        zookeeper-connect="zookeeperConnect">
    <int-kafka:consumer-configurations>
        <int-kafka:consumer-configuration
                group-id="default"
                key-decoder="kafkaReflectionDecoder"
                value-decoder="kafkaReflectionDecoder">
                <!-- max-messages="5000"> -->
            <int-kafka:topic-filter
                    pattern="${topic}"
                    streams="1"
                    exclude="false" />
        </int-kafka:consumer-configuration>
    </int-kafka:consumer-configurations>
</int-kafka:consumer-context>

Exception:

org.springframework.beans.factory.BeanDefinitionStoreException: Unexpected exception parsing XML document from class path resource [wda/spring-context-kafka.xml]; nested exception is java.lang.RuntimeException: ${kafka.consumer.topic} is an invalid regex.
    at org.springframework.beans.factory.xml.XmlBeanDefinitionReader.doLoadBeanDefinitions(XmlBeanDefinitionReader.java:413) ~[storm.kafka-1.0.jar:na]
    at org.springframework.beans.factory.xml.XmlBeanDefinitionReader.loadBeanDefinitions(XmlBeanDefinitionReader.java:335) ~[storm.kafka-1.0.jar:na]
    at org.springframework.beans.factory.xml.XmlBeanDefinitionReader.loadBeanDefinitions(XmlBeanDefinitionReader.java:303) ~[storm.kafka-1.0.jar:na]
    at org.springframework.beans.factory.support.AbstractBeanDefinitionReader.loadBeanDefinitions(AbstractBeanDefinitionReader.java:180) ~[storm.kafka-1.0.jar:na]
    at org.springframework.beans.factory.support.AbstractBeanDefinitionReader.loadBeanDefinitions(AbstractBeanDefinitionReader.java:216) ~[storm.kafka-1.0.jar:na]
    at org.springframework.beans.factory.support.AbstractBeanDefinitionReader.loadBeanDefinitions(AbstractBeanDefinitionReader.java:187) ~[storm.kafka-1.0.jar:na]
    at org.springframework.beans.factory.support.AbstractBeanDefinitionReader.loadBeanDefinitions(AbstractBeanDefinitionReader.java:251) ~[storm.kafka-1.0.jar:na]
    at org.springframework.context.support.AbstractXmlApplicationContext.loadBeanDefinitions(AbstractXmlApplicationContext.java:127) ~[storm.kafka-1.0.jar:na]
    at org.springframework.context.support.AbstractXmlApplicationContext.loadBeanDefinitions(AbstractXmlApplicationContext.java:93) ~[storm.kafka-1.0.jar:na]
    at org.springframework.context.support.AbstractRefreshableApplicationContext.refreshBeanFactory(AbstractRefreshableApplicationContext.java:129) ~[storm.kafka-1.0.jar:na]
    at org.springframework.context.support.AbstractApplicationContext.obtainFreshBeanFactory(AbstractApplicationContext.java:540) ~[storm.kafka-1.0.jar:na]
    at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:454) ~[storm.kafka-1.0.jar:na]
    at com.xplusone.storm.kafka.spout.WDAKafkaSpout.open(WDAKafkaSpout.java:98) ~[storm.kafka-1.0.jar:na]
    at backtype.storm.daemon.executor$fn__3430$fn__3445.invoke(executor.clj:504) ~[storm-core-0.9.0.1.jar:na]
    at backtype.storm.util$async_loop$fn__444.invoke(util.clj:401) ~[storm-core-0.9.0.1.jar:na]
    at clojure.lang.AFn.run(AFn.java:24) [clojure-1.4.0.jar:na]
    at java.lang.Thread.run(Thread.java:744) [na:1.7.0_45]
Caused by: java.lang.RuntimeException: ${kafka.consumer.topic} is an invalid regex.
    at kafka.consumer.TopicFilter.<init>(TopicFilter.scala:39) ~[storm.kafka-1.0.jar:na]
    at kafka.consumer.Whitelist.<init>(TopicFilter.scala:49) ~[storm.kafka-1.0.jar:na]
    at org.springframework.integration.kafka.support.TopicFilterConfiguration.<init>(TopicFilterConfiguration.java:30) ~[storm.kafka-1.0.jar:na]
    at org.springframework.integration.kafka.config.xml.KafkaConsumerContextParser.parseConsumerConfigurations(KafkaConsumerContextParser.java:84) ~[storm.kafka-1.0.jar:na]
    at org.springframework.integration.kafka.config.xml.KafkaConsumerContextParser.doParse(KafkaConsumerContextParser.java:49) ~[storm.kafka-1.0.jar:na]
    at org.springframework.beans.factory.xml.AbstractSingleBeanDefinitionParser.parseInternal(AbstractSingleBeanDefinitionParser.java:85) ~[storm.kafka-1.0.jar:na]
    at org.springframework.beans.factory.xml.AbstractBeanDefinitionParser.parse(AbstractBeanDefinitionParser.java:60) ~[storm.kafka-1.0.jar:na]
    at org.springframework.beans.factory.xml.NamespaceHandlerSupport.parse(NamespaceHandlerSupport.java:74) ~[storm.kafka-1.0.jar:na]
    at org.springframework.integration.config.xml.AbstractIntegrationNamespaceHandler.parse(AbstractIntegrationNamespaceHandler.java:63) ~[storm.kafka-1.0.jar:na]
    at org.springframework.beans.factory.xml.BeanDefinitionParserDelegate.parseCustomElement(BeanDefinitionParserDelegate.java:1424) ~[storm.kafka-1.0.jar:na]
    at org.springframework.beans.factory.xml.BeanDefinitionParserDelegate.parseCustomElement(BeanDefinitionParserDelegate.java:1414) ~[storm.kafka-1.0.jar:na]
    at org.springframework.beans.factory.xml.DefaultBeanDefinitionDocumentReader.parseBeanDefinitions(DefaultBeanDefinitionDocumentReader.java:187) ~[storm.kafka-1.0.jar:na]
    at org.springframework.beans.factory.xml.DefaultBeanDefinitionDocumentReader.doRegisterBeanDefinitions(DefaultBeanDefinitionDocumentReader.java:141) ~[storm.kafka-1.0.jar:na]
    at org.springframework.beans.factory.xml.DefaultBeanDefinitionDocumentReader.registerBeanDefinitions(DefaultBeanDefinitionDocumentReader.java:110) ~[storm.kafka-1.0.jar:na]
    at org.springframework.beans.factory.xml.XmlBeanDefinitionReader.registerBeanDefinitions(XmlBeanDefinitionReader.java:508) ~[storm.kafka-1.0.jar:na]
    at org.springframework.beans.factory.xml.XmlBeanDefinitionReader.doLoadBeanDefinitions(XmlBeanDefinitionReader.java:391) ~[storm.kafka-1.0.jar:na]
    ... 16 common frames omitted

Kafka: pass zookeeper namespace information

Hi,
My topics are under a particular namespace, /kafka-production/brokers/topics/notification_development. Is there a way to pass /kafka-production/ to the consumer? I believe my consumer is looking for the topic under the ROOT path in ZooKeeper, which is why no messages are being consumed.

<int-kafka:consumer-context id="consumerContext" consumer-timeout="10000" zookeeper-connect="zookeeperConnect" consumer-properties="consumerProperties">
    <int-kafka:consumer-configurations>
        <int-kafka:consumer-configuration group-id="notification_consumer_development" max-messages="1">
                <int-kafka:topic id="*.notification_development" streams="1"/>
        </int-kafka:consumer-configuration>
    </int-kafka:consumer-configurations>
</int-kafka:consumer-context>

Thanks
-Parshu

ZipTransformer: "deleteFiles" throwing exception

Issue:
Setting deleteFiles=true causes the exception below.

org.zeroturnaround.zip.ZipException: java.io.FileNotFoundException: <absolute path of file> (The system cannot find the file specified)

Reason:
Inside ZipTransformer, deleteFiles deletes the original file before the corresponding FileSource is used to compress it.
That causes the FileNotFoundException.

Autowired doesn't work with spring-integration-java-DSL

I created a simple integration flow using a handler. The handler is autowired in the MyIntegrationFlow class, and the handler uses a MyService bean that is also autowired.

The problem comes when I try to use the MyService bean: at that point, the bean is null.

MyIntegrationFlow file

@Configuration
public class MyIntegrationFlow {

    @Autowired
    private MyHandleGateway myHandleGateway;

    @Bean
    public IntegrationFlow cceWsGetNotificationRequestFlow() {
        return IntegrationFlows.from(CceServiceConstant.CCE_WS_RECEIVE_MESSAGE_CHANNEL)
                .handle(myHandleGateway)
                .get();
    }
}

MyHandleGateway file

@Component
public class MyHandleGateway extends AbstractReplyProducingMessageHandler {

    @Autowired
    private MyService myService;

    protected Object handleRequestMessage(Message<?> requestMessage) {
        .....
    }
}
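For comparison, a pattern that usually avoids null @Autowired fields is to let the container hand the handler to the @Bean method as a parameter, so the flow can only ever see the fully-initialized, Spring-managed instance (a sketch using the names from the snippets above; it assumes MyHandleGateway is picked up by component scanning):

```java
@Configuration
public class MyIntegrationFlow {

    @Bean
    public IntegrationFlow cceWsGetNotificationRequestFlow(MyHandleGateway myHandleGateway) {
        // myHandleGateway is injected by Spring here, so its own
        // @Autowired MyService field has already been populated.
        return IntegrationFlows.from(CceServiceConstant.CCE_WS_RECEIVE_MESSAGE_CHANNEL)
                .handle(myHandleGateway)
                .get();
    }
}
```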

Spring integration Kafka Sample not working.

Hi,
I am trying to use a very basic Spring Integration Kafka outbound flow. I am referring to the code sample at

https://github.com/spring-projects/spring-integration-extensions/tree/master/samples/kafka

Error creating bean with name 'producerMetadata_test2': Initialization of bean failed; nested exception is java.lang.reflect.MalformedParameterizedTypeException

I think it's failing due to the line below in KafkaProducerContextParser.

producerConfigurationBuilder.addConstructorArgReference("prodFactory_" + producerConfiguration.getAttribute("topic"));

prodFactory_topic2 is a bean of type ProducerFactoryBean, whereas ProducerConfiguration (producerConfigurationBuilder) expects a reference to kafka.javaapi.producer.Producer as its second constructor parameter.

I think we need to call setFactoryMethod("getObject") on the producer factory builder, though I am not sure about it.

Please see attached configuration I am using.


Can't configure channel for SmppInboundAdapter using @InboundChannelAdapter annotation

I'm trying to configure an SmppInboundChannelAdapter using Java config, and it doesn't let me configure the channel using the @InboundChannelAdapter(value = "fooChannel") annotation, which I understand to be the normal way of configuring these (see the JdbcPollingChannelAdapter example at https://docs.spring.io/spring-integration/reference/html/overview.html ).

I've been able to configure it using the following:

@Bean
public SmppInboundChannelAdapter smppInboundChannelAdapter() {
	SmppInboundChannelAdapter smppInboundChannelAdapter = new SmppInboundChannelAdapter();
	try {
		smppInboundChannelAdapter.setSmppSession(smppSessionFactory().getObject());
		smppInboundChannelAdapter.setChannel(context.getBean("smsChannel", MessageChannel.class));
	} catch (Exception e) {
		e.printStackTrace();
	}
	return smppInboundChannelAdapter;
}

But this is rather ugly and it would be good if it could be configured using the annotation.
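Until the adapter supports the annotation, the explicit context lookup can at least be replaced by injecting the channel bean as a method parameter (a sketch using the adapter's existing setChannel setter):

```java
// Sketch: inject the channel bean instead of fetching it by name
// from the ApplicationContext.
@Bean
public SmppInboundChannelAdapter smppInboundChannelAdapter(
        @Qualifier("smsChannel") MessageChannel smsChannel) throws Exception {
    SmppInboundChannelAdapter adapter = new SmppInboundChannelAdapter();
    adapter.setSmppSession(smppSessionFactory().getObject());
    adapter.setChannel(smsChannel);
    return adapter;
}
```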

Looking at the adapter code, it appears to extend AbstractEndpoint and implement its own setChannel method.

It would appear better to extend org.springframework.integration.endpoint.MessageProducerSupport instead, which provides methods such as setOutputChannel and setOutputChannelName, which appear to be what the annotation expects.

zip: Message header for ZIP_ENTRY_PATH is always empty

I'm trying to use the ZIP_ENTRY_PATH value, but I notice it's always empty when I set a breakpoint in UnZipResultSplitter. I'm guessing this is because I'm using the BYTE_ARRAY zip result type on the zip transformer, but it would be nice if there were some way to preserve the containing file's name.
