
kafka-beginners-course's People

Contributors

aureliemarcuzzo, darren-rose, martinezcarlos, ryancastle, sderosiaux, simplesteph, stuzanna


kafka-beginners-course's Issues

Kafka JS webpage consumer

Hi,
Congratulations on your work!

I watched your videos and learned a lot, but I haven't reached my goal yet...
How can I consume a topic from a webpage, using an HTML and JS file (or PHP and JS)? I want to change the DOM or show an alert whenever some data is consumed...

Like the Pusher client, they have this:
[image: Pusher client example]

I searched for this, but I only found a Node.js consumer...

Do you have a tutorial or video about it?

Thanks in advance
Best regards
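
This is not something the course covers directly, but a common pattern (a sketch under assumptions, not the project's code): a browser cannot speak Kafka's binary protocol, so some server-side component has to do the actual consuming and push data to the page. The sketch below consumes a topic and broadcasts every record to connected browsers over WebSockets; the page opens new WebSocket("ws://localhost:8887") and updates the DOM or shows an alert in its onmessage handler. It assumes the org.java-websocket:Java-WebSocket library; the port, topic, and class names are placeholders.

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.java_websocket.WebSocket;
import org.java_websocket.handshake.ClientHandshake;
import org.java_websocket.server.WebSocketServer;

import java.net.InetSocketAddress;
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

public class KafkaWebSocketBridge extends WebSocketServer {

    public KafkaWebSocketBridge(int port) {
        super(new InetSocketAddress(port));
    }

    @Override public void onOpen(WebSocket conn, ClientHandshake handshake) { }
    @Override public void onClose(WebSocket conn, int code, String reason, boolean remote) { }
    @Override public void onMessage(WebSocket conn, String message) { }
    @Override public void onError(WebSocket conn, Exception ex) { ex.printStackTrace(); }
    @Override public void onStart() { }

    public static void main(String[] args) {
        KafkaWebSocketBridge bridge = new KafkaWebSocketBridge(8887); // hypothetical port
        bridge.start(); // the WebSocket server runs on its own thread

        Properties props = new Properties();
        props.setProperty(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.setProperty(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.setProperty(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.setProperty(ConsumerConfig.GROUP_ID_CONFIG, "webpage-bridge");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singleton("demo-topic")); // hypothetical topic
            while (true) {
                for (ConsumerRecord<String, String> record : consumer.poll(Duration.ofMillis(500))) {
                    // broadcast(String) fans the record value out to every connected page
                    bridge.broadcast(record.value());
                }
            }
        }
    }
}

A Node.js/SSE bridge or Confluent's REST Proxy achieves the same thing; the key point is that the consuming happens server-side, and the webpage only receives pushed updates.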

Getting OpenSearch exception using XContentType.JSON


import org.apache.http.HttpHost;
import org.apache.http.auth.AuthScope;
import org.apache.http.auth.UsernamePasswordCredentials;
import org.apache.http.client.CredentialsProvider;
import org.apache.http.impl.client.BasicCredentialsProvider;
import org.apache.http.impl.client.DefaultConnectionKeepAliveStrategy;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.opensearch.action.index.IndexRequest;
import org.opensearch.action.index.IndexResponse;
import org.opensearch.client.RequestOptions;
import org.opensearch.client.RestClient;
import org.opensearch.client.RestHighLevelClient;
import org.opensearch.client.indices.CreateIndexRequest;
import org.opensearch.client.indices.GetIndexRequest;
import org.opensearch.common.xcontent.XContentType;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import java.io.IOException;
import java.net.URI;
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

public class OpenSearchConsumer {

    private static final Logger log = LoggerFactory.getLogger(OpenSearchConsumer.class);

    private static RestHighLevelClient createOpenSearchClient() {
        // Local OpenSearch server
        String connection = "http://localhost:9200/";

        // Build a URI from the connection string
        RestHighLevelClient restHighLevelClient;
        URI connectionURI = URI.create(connection);

        // Extract login info, if any
        String userInfo = connectionURI.getUserInfo();
        if (userInfo == null) {
            // REST client without security
            restHighLevelClient = new RestHighLevelClient(RestClient.builder(
                    new HttpHost(connectionURI.getHost(), connectionURI.getPort(), "http")));
        } else {
            // REST client with security
            String[] auth = userInfo.split(":");
            CredentialsProvider provider = new BasicCredentialsProvider();
            provider.setCredentials(AuthScope.ANY, new UsernamePasswordCredentials(auth[0], auth[1]));
            restHighLevelClient = new RestHighLevelClient(
                    RestClient.builder(new HttpHost(connectionURI.getHost(), connectionURI.getPort(), connectionURI.getScheme()))
                            .setHttpClientConfigCallback(
                                    httpAsyncClientBuilder -> httpAsyncClientBuilder.setDefaultCredentialsProvider(provider)
                                            .setKeepAliveStrategy(new DefaultConnectionKeepAliveStrategy())));
        }
        return restHighLevelClient;
    }
    private static KafkaConsumer<String, String> createKafkaConsumer() {
        String bootstrapServer = "localhost:9092"; // bootstrap.servers takes host:port, no http:// scheme
        String groupId = "opensearch-consumer";

        Properties properties = new Properties();
        properties.setProperty(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServer);
        properties.setProperty(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        properties.setProperty(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        properties.setProperty(ConsumerConfig.GROUP_ID_CONFIG, groupId);
        properties.setProperty(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "latest");
        // properties.setProperty(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false");

        return new KafkaConsumer<>(properties);
    }
    public static void main(String[] args) {

        // Create an OpenSearch client
        RestHighLevelClient openSearchClient = createOpenSearchClient();

        // Create a Kafka consumer
        KafkaConsumer<String, String> consumer = createKafkaConsumer();

        try (openSearchClient; consumer) {

            boolean indexExists = openSearchClient.indices().exists(new GetIndexRequest("wikimedia"), RequestOptions.DEFAULT);

            if (!indexExists) {
                // Create the index on OpenSearch if it doesn't exist
                CreateIndexRequest createIndexRequest = new CreateIndexRequest("wikimedia");
                openSearchClient.indices().create(createIndexRequest, RequestOptions.DEFAULT);
                log.info("Wikimedia index has been created.");
            } else {
                log.info("Index already exists.");
            }

            // Subscribe the consumer to the topic
            consumer.subscribe(Collections.singleton("wikimedia.recentchange"));

            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(3000));
                int recordCount = records.count();
                log.info("Received: {} records", recordCount);

                for (ConsumerRecord<String, String> record : records) {
                    // Send each record to OpenSearch
                    IndexRequest indexRequest = new IndexRequest("wikimedia")
                            .source(record.value(), XContentType.JSON);
                    IndexResponse response = openSearchClient.index(indexRequest, RequestOptions.DEFAULT);
                    log.info("Inserted document with id: {} into OpenSearch", response.getId());
                }
            }

        } catch (IOException e) {
            log.error("Unexpected I/O error in the consumer", e);
        }
    }
}

Caused by: OpenSearchException[OpenSearch exception [type=not_x_content_exception, reason=Compressor detection can only be called on some xcontent bytes or compressed xcontent bytes]]
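
A hedged note on the likely cause (not an official answer): OpenSearch raises not_x_content_exception when the request body passed with XContentType.JSON is not actually JSON, for example an empty string or a plain-text record value pulled from the topic. A minimal guard, as a variant of the for-loop above using the same variables:

                for (ConsumerRecord<String, String> record : records) {
                    String value = record.value();
                    // Skip values that are empty or clearly not JSON objects; sending them
                    // with XContentType.JSON is what triggers not_x_content_exception.
                    if (value == null || value.trim().isEmpty() || !value.trim().startsWith("{")) {
                        log.warn("Skipping non-JSON record at offset {}", record.offset());
                        continue;
                    }
                    IndexRequest indexRequest = new IndexRequest("wikimedia").source(value, XContentType.JSON);
                    openSearchClient.index(indexRequest, RequestOptions.DEFAULT);
                }

The startsWith("{") check is only a cheap heuristic; parsing the value with a JSON library before indexing would be more robust.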

Error message: Thread is already running.

package twitter.kafka;

import com.google.common.collect.Lists;
import com.twitter.hbc.ClientBuilder;
import com.twitter.hbc.core.Client;
import com.twitter.hbc.core.Constants;
import com.twitter.hbc.core.Hosts;
import com.twitter.hbc.core.HttpHosts;
import com.twitter.hbc.core.endpoint.StatusesFilterEndpoint;
import com.twitter.hbc.core.processor.StringDelimitedProcessor;
import com.twitter.hbc.httpclient.auth.Authentication;
import com.twitter.hbc.httpclient.auth.OAuth1;
import org.slf4j.LoggerFactory;

import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;
import org.slf4j.Logger;

public class TwitterProducer {

    private final Logger logger = LoggerFactory.getLogger(TwitterProducer.class.getName());

    public static void main(String[] args) {
        new TwitterProducer().run();
    }

    public void run() {
        // Set up your blocking queue: size it properly based on the expected TPS of your stream
        BlockingQueue<String> msgQueue = new LinkedBlockingQueue<String>(1000);

        Client client = createTwitterClient(msgQueue);
        client.connect();

        // loop to read tweets from the queue
        while (!client.isDone()) {
            String msg = null;
            try {
                msg = msgQueue.poll(5, TimeUnit.SECONDS);
            } catch (InterruptedException e) {
                e.printStackTrace();
                client.stop();
            }
            if (msg != null) {
                logger.info(msg);
            }
        }
        logger.info("Exit");
    }

    // Placeholders: real credentials must come from a config file or environment, never source control
    String consumerKey = "<twitter-consumer-key>";
    String consumerSecret = "<twitter-consumer-secret>";
    String token = "<twitter-access-token>";
    String secret = "<twitter-access-token-secret>";
    public Client createTwitterClient(BlockingQueue<String> msgQueue) {
        // Declare the host you want to connect to, the endpoint, and authentication (basic auth or oauth)
        Hosts hosebirdHosts = new HttpHosts(Constants.STREAM_HOST);
        StatusesFilterEndpoint hosebirdEndpoint = new StatusesFilterEndpoint();

        // Optional: set up some followings and track terms
        List<String> terms = Lists.newArrayList("Palghar");
        hosebirdEndpoint.trackTerms(terms);

        // These secrets should be read from a config file
        Authentication hosebirdAuth = new OAuth1(consumerKey, consumerSecret, token, secret);

        ClientBuilder builder = new ClientBuilder()
                .name("HoseBirdClient-01")                  // optional: mainly for the logs
                .hosts(hosebirdHosts)
                .authentication(hosebirdAuth)
                .endpoint(hosebirdEndpoint)
                .processor(new StringDelimitedProcessor(msgQueue));

        Client hosebirdClient = builder.build();

        // Attempts to establish a connection.
        hosebirdClient.connect();
        return hosebirdClient;
    }
}

Error:
"C:\Program Files\Java\jdk1.8.0_221\bin\java" "-javaagent:D:\123\IntelliJ IDEA 2017.2.2\lib\idea_rt.jar=51166:D:\123\IntelliJ IDEA 2017.2.2\bin" -Dfile.encoding=UTF-8 -classpath "C:\Program Files\Java\jdk1.8.0_221\jre\lib\charsets.jar;C:\Program Files\Java\jdk1.8.0_221\jre\lib\deploy.jar;C:\Program Files\Java\jdk1.8.0_221\jre\lib\ext\access-bridge-64.jar;C:\Program Files\Java\jdk1.8.0_221\jre\lib\ext\cldrdata.jar;C:\Program Files\Java\jdk1.8.0_221\jre\lib\ext\dnsns.jar;C:\Program Files\Java\jdk1.8.0_221\jre\lib\ext\jaccess.jar;C:\Program Files\Java\jdk1.8.0_221\jre\lib\ext\jfxrt.jar;C:\Program Files\Java\jdk1.8.0_221\jre\lib\ext\localedata.jar;C:\Program Files\Java\jdk1.8.0_221\jre\lib\ext\nashorn.jar;C:\Program Files\Java\jdk1.8.0_221\jre\lib\ext\sunec.jar;C:\Program Files\Java\jdk1.8.0_221\jre\lib\ext\sunjce_provider.jar;C:\Program Files\Java\jdk1.8.0_221\jre\lib\ext\sunmscapi.jar;C:\Program Files\Java\jdk1.8.0_221\jre\lib\ext\sunpkcs11.jar;C:\Program Files\Java\jdk1.8.0_221\jre\lib\ext\zipfs.jar;C:\Program Files\Java\jdk1.8.0_221\jre\lib\javaws.jar;C:\Program Files\Java\jdk1.8.0_221\jre\lib\jce.jar;C:\Program Files\Java\jdk1.8.0_221\jre\lib\jfr.jar;C:\Program Files\Java\jdk1.8.0_221\jre\lib\jfxswt.jar;C:\Program Files\Java\jdk1.8.0_221\jre\lib\jsse.jar;C:\Program Files\Java\jdk1.8.0_221\jre\lib\management-agent.jar;C:\Program Files\Java\jdk1.8.0_221\jre\lib\plugin.jar;C:\Program Files\Java\jdk1.8.0_221\jre\lib\resources.jar;C:\Program Files\Java\jdk1.8.0_221\jre\lib\rt.jar;D:\Study Material only\kafka\project\target\classes;C:\Users\rudra.m2\repository\org\apache\kafka\kafka-clients\2.5.0\kafka-clients-2.5.0.jar;C:\Users\rudra.m2\repository\com\github\luben\zstd-jni\1.4.4-7\zstd-jni-1.4.4-7.jar;C:\Users\rudra.m2\repository\org\lz4\lz4-java\1.7.1\lz4-java-1.7.1.jar;C:\Users\rudra.m2\repository\org\xerial\snappy\snappy-java\1.1.7.3\snappy-java-1.1.7.3.jar;C:\Users\rudra.m2\repository\org\slf4j\slf4j-simple\1.7.21\slf4j-simple-1.7.21.jar;C:\Users\rudra.m2\repository\org\apache\logging\log4j\log4j-slf4j-impl\2.11.2\log4j-slf4j-impl-2.11.2.jar;C:\Users\rudra.m2\repository\org\apache\logging\log4j\log4j-api\2.11.2\log4j-api-2.11.2.jar;C:\Users\rudra.m2\repository\org\apache\logging\log4j\log4j-core\2.11.2\log4j-core-2.11.2.jar;C:\Users\rudra.m2\repository\org\slf4j\slf4j-api\1.6.4\slf4j-api-1.6.4.jar;C:\Users\rudra.m2\repository\org\apache\cassandra\cassandra-all\0.8.1\cassandra-all-0.8.1.jar;C:\Users\rudra.m2\repository\com\google\guava\guava\r08\guava-r08.jar;C:\Users\rudra.m2\repository\commons-cli\commons-cli\1.1\commons-cli-1.1.jar;C:\Users\rudra.m2\repository\commons-codec\commons-codec\1.2\commons-codec-1.2.jar;C:\Users\rudra.m2\repository\commons-collections\commons-collections\3.2.1\commons-collections-3.2.1.jar;C:\Users\rudra.m2\repository\commons-lang\commons-lang\2.4\commons-lang-2.4.jar;C:\Users\rudra.m2\repository\com\googlecode\concurrentlinkedhashmap\concurrentlinkedhashmap-lru\1.1\concurrentlinkedhashmap-lru-1.1.jar;C:\Users\rudra.m2\repository\org\antlr\antlr\3.2\antlr-3.2.jar;C:\Users\rudra.m2\repository\org\antlr\antlr-runtime\3.2\antlr-runtime-3.2.jar;C:\Users\rudra.m2\repository\org\antlr\stringtemplate\3.2\stringtemplate-3.2.jar;C:\Users\rudra.m2\repository\antlr\antlr\2.7.7\antlr-2.7.7.jar;C:\Users\rudra.m2\repository\org\apache\cassandra\deps\avro\1.4.0-cassandra-1\avro-1.4.0-cassandra-1.jar;C:\Users\rudra.m2\repository\org\mortbay\jetty\jetty\6.1.22\jetty-6.1.22.jar;C:\Users\rudra.m2\repository\org\mortbay\jetty\jetty-util\6.1.22\jetty-util-6.1.22.jar;C:\Users\rudra.m
2\repository\org\mortbay\jetty\servlet-api\2.5-20081211\servlet-api-2.5-20081211.jar;C:\Users\rudra.m2\repository\org\codehaus\jackson\jackson-core-asl\1.4.0\jackson-core-asl-1.4.0.jar;C:\Users\rudra.m2\repository\org\codehaus\jackson\jackson-mapper-asl\1.4.0\jackson-mapper-asl-1.4.0.jar;C:\Users\rudra.m2\repository\jline\jline\0.9.94\jline-0.9.94.jar;C:\Users\rudra.m2\repository\com\googlecode\json-simple\json-simple\1.1\json-simple-1.1.jar;C:\Users\rudra.m2\repository\com\github\stephenc\high-scale-lib\high-scale-lib\1.1.2\high-scale-lib-1.1.2.jar;C:\Users\rudra.m2\repository\org\yaml\snakeyaml\1.6\snakeyaml-1.6.jar;C:\Users\rudra.m2\repository\org\apache\thrift\libthrift\0.6.1\libthrift-0.6.1.jar;C:\Users\rudra.m2\repository\junit\junit\4.4\junit-4.4.jar;C:\Users\rudra.m2\repository\javax\servlet\servlet-api\2.5\servlet-api-2.5.jar;C:\Users\rudra.m2\repository\org\apache\cassandra\cassandra-thrift\0.8.1\cassandra-thrift-0.8.1.jar;C:\Users\rudra.m2\repository\com\github\stephenc\jamm\0.2.2\jamm-0.2.2.jar;C:\Users\rudra.m2\repository\com\twitter\hbc-core\2.2.0\hbc-core-2.2.0.jar;C:\Users\rudra.m2\repository\org\apache\httpcomponents\httpclient\4.2.5\httpclient-4.2.5.jar;C:\Users\rudra.m2\repository\org\apache\httpcomponents\httpcore\4.2.4\httpcore-4.2.4.jar;C:\Users\rudra.m2\repository\commons-logging\commons-logging\1.1.1\commons-logging-1.1.1.jar;C:\Users\rudra.m2\repository\com\twitter\joauth\6.0.2\joauth-6.0.2.jar;C:\Users\rudra.m2\repository\com\google\code\findbugs\jsr305\1.3.9\jsr305-1.3.9.jar" twitter.kafka.TwitterProducer
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/C:/Users/rudra/.m2/repository/org/slf4j/slf4j-simple/1.7.21/slf4j-simple-1.7.21.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/C:/Users/rudra/.m2/repository/org/apache/logging/log4j/log4j-slf4j-impl/2.11.2/log4j-slf4j-impl-2.11.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
[main] INFO com.twitter.hbc.httpclient.BasicClient - New connection executed: HoseBirdClient-01, endpoint: /1.1/statuses/filter.json?delimited=length&stall_warnings=true
Exception in thread "main" java.lang.IllegalStateException: There is already a connection thread running for HoseBirdClient-01, endpoint: /1.1/statuses/filter.json?delimited=length&stall_warnings=true
at com.twitter.hbc.httpclient.BasicClient.connect(BasicClient.java:92)
at twitter.kafka.TwitterProducer.run(TwitterProducer.java:35)
at twitter.kafka.TwitterProducer.main(TwitterProducer.java:26)

Process finished with exit code 1
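
The stack trace points at the cause: connect() is invoked twice on the same client, once at the end of createTwitterClient(...) and again in run(). hbc's BasicClient.connect() throws IllegalStateException if its connection thread is already running, which is exactly the "There is already a connection thread running" message above. A minimal fix, sketched against the code above, is to connect in exactly one place:

    public Client createTwitterClient(BlockingQueue<String> msgQueue) {
        // ... same builder setup as above ...
        return builder.build();   // build only; no connect() here
    }

    public void run() {
        BlockingQueue<String> msgQueue = new LinkedBlockingQueue<String>(1000);
        Client client = createTwitterClient(msgQueue);
        client.connect();         // the single connect() call
        // ... polling loop as above ...
    }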
