Kafka Streams: UnsatisfiedLinkError on the RocksDB native library

Asked: 2018-04-23 09:36:47

Tags: scala apache-kafka apache-kafka-streams rocksdb

I am trying to solve the word count problem with Kafka Streams. I am using Kafka 1.1.0 with Scala 2.11.12 and sbt 1.1.4. I get the following error:

Exception in thread "wordcount-application-d81ee069-9307-46f1-8e71-c9f777d2db64-StreamThread-1" java.lang.UnsatisfiedLinkError: C:\Users\user\AppData\Local\Temp\librocksdbjni5439068356048679315.dll: À¦¥Y
at java.lang.ClassLoader$NativeLibrary.load(Native Method)
at java.lang.ClassLoader.loadLibrary0(ClassLoader.java:1941)
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1824)
at java.lang.Runtime.load0(Runtime.java:809)
at java.lang.System.load(System.java:1086)
at org.rocksdb.NativeLibraryLoader.loadLibraryFromJar(NativeLibraryLoader.java:78)
at org.rocksdb.NativeLibraryLoader.loadLibrary(NativeLibraryLoader.java:56)
at org.rocksdb.RocksDB.loadLibrary(RocksDB.java:64)
at org.rocksdb.RocksDB.<clinit>(RocksDB.java:35)
at org.rocksdb.Options.<clinit>(Options.java:25)
at org.apache.kafka.streams.state.internals.RocksDBStore.openDB(RocksDBStore.java:116)
at org.apache.kafka.streams.state.internals.RocksDBStore.init(RocksDBStore.java:167)
at org.apache.kafka.streams.state.internals.ChangeLoggingKeyValueBytesStore.init(ChangeLoggingKeyValueBytesStore.java:40)
at org.apache.kafka.streams.state.internals.CachingKeyValueStore.init(CachingKeyValueStore.java:63)
at org.apache.kafka.streams.state.internals.InnerMeteredKeyValueStore.init(InnerMeteredKeyValueStore.java:160)
at org.apache.kafka.streams.state.internals.MeteredKeyValueBytesStore.init(MeteredKeyValueBytesStore.java:102)
at org.apache.kafka.streams.processor.internals.AbstractTask.registerStateStores(AbstractTask.java:225)
at org.apache.kafka.streams.processor.internals.StreamTask.initializeStateStores(StreamTask.java:162)
at org.apache.kafka.streams.processor.internals.AssignedTasks.initializeNewTasks(AssignedTasks.java:88)
at org.apache.kafka.streams.processor.internals.TaskManager.updateNewAndRestoringTasks(TaskManager.java:316)
at org.apache.kafka.streams.processor.internals.StreamThread.runOnce(StreamThread.java:789)
at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:750)
at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:720)

I have already tried the solution given here: UnsatisfiedLinkError on Lib rocks DB dll when developing with Kafka Streams

Here is the code I tried in Scala:

// Imports the snippet relies on (Kafka 1.1.0 APIs):
import java.lang
import java.util.Properties
import java.util.concurrent.TimeUnit

import org.apache.kafka.common.serialization.Serdes
import org.apache.kafka.common.utils.Bytes
import org.apache.kafka.streams.kstream.{KGroupedStream, KStream, KTable, KeyValueMapper, Materialized, Produced, ValueMapper}
import org.apache.kafka.streams.state.KeyValueStore
import org.apache.kafka.streams.{KafkaStreams, StreamsBuilder, StreamsConfig}

import scala.collection.JavaConverters._

object WordCountApplication {

  def main(args: Array[String]) {
    val config: Properties = {
      val p = new Properties()
      p.put(StreamsConfig.APPLICATION_ID_CONFIG, "wordcount-application")
      p.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092")
      p.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass)
      p.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass)
      p
    }

    val builder: StreamsBuilder = new StreamsBuilder()
    val textLines: KStream[String, String] = builder.stream("streams-plaintext-input")

    val afterFlatMap: KStream[String, String] = textLines.flatMapValues(new ValueMapper[String,java.lang.Iterable[String]] {
      override def apply(value: String): lang.Iterable[String] = value.split("\\W+").toIterable.asJava
    })

    val afterGroupBy: KGroupedStream[String, String] = afterFlatMap.groupBy(new KeyValueMapper[String,String,String] {
      override def apply(key: String, value: String): String = value
    })


    val wordCounts: KTable[String, Long] = afterGroupBy
      .count(Materialized.as("counts-store").asInstanceOf[Materialized[String, Long, KeyValueStore[Bytes, Array[Byte]]]])
    wordCounts.toStream().to("streams-wordcount-output", Produced.`with`(Serdes.String(), Serdes.Long()))

    val streams: KafkaStreams = new KafkaStreams(builder.build(), config)
    streams.start()

    Runtime.getRuntime.addShutdownHook(new Thread(
      new Runnable{
        override def run() = streams.close(10, TimeUnit.SECONDS)}
    ))
  }
}
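As a sanity check independent of the Streams topology, the RocksDB JNI library can be loaded directly. If the same UnsatisfiedLinkError appears here, the problem is purely in the native-library setup, not in the Streams code. A minimal sketch, assuming rocksdbjni is on the classpath (the object name is arbitrary):

```scala
// Minimal reproduction: trigger the same native-library load that
// Kafka Streams performs when it opens its first RocksDB state store.
object RocksDbLoadCheck {
  def main(args: Array[String]): Unit = {
    // Extracts the bundled .dll/.so to a temp directory and System.load()s it;
    // this is the call that fails with UnsatisfiedLinkError in the stack trace above.
    org.rocksdb.RocksDB.loadLibrary()
    println("rocksdbjni loaded successfully")
  }
}
```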

build.sbt

name := "KafkaStreamDemo"

version := "0.1"

scalaVersion := "2.11.12"

libraryDependencies ++= Seq(
  "org.apache.kafka" %% "kafka" % "1.1.0",
  "org.apache.kafka" % "kafka-clients" % "1.1.0",
  "org.apache.kafka" % "kafka-streams" % "1.1.0",
  "ch.qos.logback" % "logback-classic" % "1.2.3"
)

Has anyone run into this problem? Please help.

2 Answers:

Answer 0 (score: 2)

Finally, I found an answer that works. I followed Unable to load rocksdbjni

I did two things that worked for me.

1) I installed the Visual C++ Redistributable for Visual Studio 2015.

2) Previously I was using rocksdb 5.7.3 with kafka-streams 1.1.0 (kafka-streams 1.1.0 ships with rocksdb 5.7.3 by default). I excluded the rocksdb dependency from kafka-streams and added rocksdb 5.3.6 instead. For reference, my build.sbt is below.

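The answer's build.sbt did not survive extraction. The following is a sketch of what the described change would look like, assuming the exclusion targets the transitive rocksdbjni artifact and that version 5.3.6 is pinned directly:

```scala
name := "KafkaStreamDemo"

version := "0.1"

scalaVersion := "2.11.12"

libraryDependencies ++= Seq(
  "org.apache.kafka" %% "kafka" % "1.1.0",
  "org.apache.kafka" % "kafka-clients" % "1.1.0",
  // Exclude the rocksdbjni 5.7.3 that kafka-streams 1.1.0 pulls in transitively...
  ("org.apache.kafka" % "kafka-streams" % "1.1.0")
    .exclude("org.rocksdb", "rocksdbjni"),
  // ...and pin the older build whose native library loads on this machine.
  "org.rocksdb" % "rocksdbjni" % "5.3.6",
  "ch.qos.logback" % "logback-classic" % "1.2.3"
)
```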

Hope it helps someone.

Thanks

Answer 1 (score: 0)

In my case I was using kafka-streams:1.0.2. Changing the base Docker image from alpine-jdk8:latest to openjdk:8-jre fixed it.
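The base-image change can be sketched as a small Dockerfile edit (the jar name and path here are placeholders). Alpine images ship musl libc, while the native librocksdbjni bundled in the jar is linked against glibc, so it fails to load on Alpine; the Debian-based openjdk:8-jre image provides glibc:

```dockerfile
# FROM alpine-jdk8:latest    # musl-based: the rocksdbjni .so fails to load
FROM openjdk:8-jre           # Debian-based: provides glibc

# Placeholder jar name; substitute the real Streams application artifact.
COPY target/streams-app.jar /app/streams-app.jar
CMD ["java", "-jar", "/app/streams-app.jar"]
```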

This link helped me find the solution: https://github.com/docker-flink/docker-flink/pull/22
