No suitable driver found for jdbc:mysql #1419

Open
brekko1st opened this issue Aug 13, 2024 · 0 comments
brekko1st commented Aug 13, 2024

After 2 years I'm trying to update our Kafka Connect to the latest version. Most of the plugins work without any problems on the new version, except for JdbcSinkConnector. I'm getting the following error:

 No suitable driver found for jdbc:mysql://10.124.0.3:3306/kafka_testing

I'm running it locally via docker compose. Here is the docker-compose file:

version: '3'
services:
  connect:
    image: test-connect
    container_name: connect
    hostname: connect
    ports:
      - "8083:8083"
    environment:
      CONNECT_LOG4J_ROOT_LOGLEVEL: INFO
      CONNECT_KEY_CONVERTER_SCHEMA_REGISTRY_URL: http://10.36.0.64:8081
      CONNECT_VALUE_CONVERTER_SCHEMA_REGISTRY_URL: http://10.36.0.64:8081
      CONNECT_OFFSET_FLUSH_INTERVAL_MS: 10000
      CONNECT_REST_PORT: 8083
      CONNECT_INTERNAL_KEY_CONVERTER: org.apache.kafka.connect.json.JsonConverter
      CONNECT_INTERNAL_VALUE_CONVERTER: org.apache.kafka.connect.json.JsonConverter
      CONNECT_LOG4J_LOGGERS: org.apache.zookeeper=ERROR,org.I0Itec.zkclient=ERROR,org.reflections=ERROR
      CONNECT_CONFIG_PROVIDERS: file
      CONNECT_CONFIG_PROVIDERS_FILE_CLASS: org.apache.kafka.common.config.provider.FileConfigProvider
      CONNECT_BOOTSTRAP_SERVERS: 10.36.0.64:9092
      CONNECT_REST_ADVERTISED_HOST_NAME: 172.16.254.4
      CONNECT_GROUP_ID: dwh-uat3
      CONNECT_CONFIG_STORAGE_TOPIC: dwh-uat-configs3
      CONNECT_OFFSET_STORAGE_TOPIC: dwh-uat-offsets3
      CONNECT_STATUS_STORAGE_TOPIC: dwh-uat-status3
      CONNECT_KEY_CONVERTER: io.confluent.connect.avro.AvroConverter
      CONNECT_VALUE_CONVERTER: io.confluent.connect.avro.AvroConverter
      CONNECT_PLUGIN_PATH: /usr/share/java,/usr/share/confluent-hub-components
      CONNECT_REPLICATION_FACTOR: 1
      CONNECT_CONFIG_STORAGE_REPLICATION_FACTOR: 1
      CONNECT_OFFSET_STORAGE_REPLICATION_FACTOR: 1
      CONNECT_STATUS_STORAGE_REPLICATION_FACTOR: 1
      CLASSPATH: /usr/share/confluent-hub-components/confluentinc-kafka-connect-jdbc/lib/mysql-connector-java-8.0.13.jar:/usr/share/confluent-hub-components/confluentinc-kafka-connect-jdbc/lib/kafka-connect-jdbc-10.7.0.jar
    networks:
      priv-aeroflot-pilot-net:
        ipv4_address: 172.16.254.4

networks:
  priv-aeroflot-pilot-net:
    driver: bridge
    ipam:
      config:
      - subnet: 172.16.254.0/28
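One thing worth noting: Kafka Connect loads each plugin through an isolated classloader, so the `CLASSPATH` variable set in the compose file may not make the driver visible to the JDBC connector; the driver jar generally has to sit in the same directory as the kafka-connect-jdbc jar (which the setup above does attempt). A quick diagnostic sketch to confirm the jar is actually where the plugin expects it (assumes the container name `connect` from this compose file, and that `unzip` exists in the image):

```shell
# Confirm the MySQL driver jar landed next to the JDBC connector jar
docker exec connect ls -l \
  /usr/share/confluent-hub-components/confluentinc-kafka-connect-jdbc/lib \
  | grep -i mysql

# Confirm the driver class is inside the jar (skip if unzip is unavailable)
docker exec connect sh -c 'unzip -l \
  /usr/share/confluent-hub-components/confluentinc-kafka-connect-jdbc/lib/mysql-connector-java-8.0.13.jar \
  | grep com/mysql/cj/jdbc/Driver'
```

If the first command prints nothing, the jar never made it into the plugin's lib directory and the classloader cannot find the driver.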

And here is the Dockerfile:

FROM confluentinc/cp-kafka-connect:7.6.2
RUN echo "===> MySQL Debezium connector ..."
RUN confluent-hub install --no-prompt debezium/debezium-connector-mysql:latest

RUN echo "===> Installing WePay BigQuery Connector ..."
RUN confluent-hub install --no-prompt wepay/kafka-connect-bigquery:2.5.6
RUN confluent-hub install --no-prompt confluentinc/connect-transforms:latest

RUN echo "===> MySQL JDBC connector ..."
RUN confluent-hub install --no-prompt confluentinc/kafka-connect-jdbc:10.7.0
COPY mysql-connector/mysql-connector-java-8.0.13.jar /usr/share/confluent-hub-components/confluentinc-kafka-connect-jdbc/lib
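Before starting the container, it may help to verify that the `COPY` step in this Dockerfile put the jar where the connector looks for it. A sketch, using the image tag `test-connect` from the compose file:

```shell
# Build the image, then list the JDBC plugin's lib directory in the built image
docker build -t test-connect .
docker run --rm --entrypoint ls test-connect -l \
  /usr/share/confluent-hub-components/confluentinc-kafka-connect-jdbc/lib
```

The listing should show both kafka-connect-jdbc-10.7.0.jar and mysql-connector-java-8.0.13.jar, readable by the container user (the cp-kafka-connect images run as a non-root user, so a root-only-readable jar would also reproduce this symptom).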

Here is the sink connector configuration:

curl -X POST -H "Content-Type: application/json" -d '{
  "name": "local_jdbc_test",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "connection.url": "jdbc:mysql://10.124.0.3:3306/kafka_testing",
    "connection.user": "kafkaconnect",
    "connection.password": "***********",
    "auto.create": "true",
    "auto.evolve": "true",
    "fields.whitelist": "id,tset_col_1,test_col_2",
    "insert.mode": "upsert",
    "pk.fields": "id",
    "pk.mode": "record_value",
    "table.name.format": "kafka_tessting_jdbc",
    "tasks.max": "1",
    "topics": "uat_mysql.kafka_testing.kafka_connect_test_3",
    "transforms": "ExtractAfter,Tombstone",
    "transforms.ExtractAfter.field": "after",
    "transforms.ExtractAfter.type": "org.apache.kafka.connect.transforms.ExtractField$Value",
    "transforms.Tombstone.behavior": "warn",
    "transforms.Tombstone.type": "io.confluent.connect.transforms.TombstoneHandler"
  }
}' http://localhost:8083/connectors/
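After posting the config, the Connect REST API can show the full failure trace and whether the plugin was registered at all, which is usually more informative than the one-line error. A diagnostic sketch against the REST port from the compose file:

```shell
# Connector/task state; a failed task's "trace" field carries the full stack trace
curl -s http://localhost:8083/connectors/local_jdbc_test/status

# Confirm the worker actually registered the JDBC sink plugin
curl -s http://localhost:8083/connector-plugins | grep -i JdbcSink
```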

I've checked the documentation at:
https://docs.confluent.io/kafka-connectors/jdbc/current/jdbc-drivers.html
and this article:
https://www.confluent.io/blog/kafka-connect-deep-dive-jdbc-source-connector/
as well as Stack Overflow and Google.
I've tried different versions of the mysql-connector .jar file, and tried putting it into the lib directory, the component directory, and the java directory. I've also tried different versions of kafka-connect-jdbc. The result is always the same: no suitable driver found. I've checked the logs and there is no info about the JDBC driver being registered.
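To see what the worker did (or did not) load, it can help to grep the worker log for the driver and the plugin classloader messages. A hedged sketch, again assuming the container name `connect` (the exact log wording varies by version, so the grep is kept broad):

```shell
# Look for any mention of the MySQL driver or the JDBC plugin being scanned/loaded
docker logs connect 2>&1 | grep -iE 'mysql|kafka-connect-jdbc' | head -50
```

If the driver jar never appears in the plugin-scanning output, the worker is not picking it up from the plugin path at all.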
