
I am trying to connect to HiveServer2 through a Java application, but I am getting the following error:

Exception in thread "main" java.sql.SQLException: [Simba][HiveJDBCDriver](500310) Invalid operation: Peer indicated failure: Unsupported mechanism type PLAIN;

        at com.cloudera.hive.hivecommon.api.HiveServer2ClientFactory.createTransport(HiveServer2ClientFactory.java:224)
        at com.cloudera.hive.hive.api.ExtendedHS2Factory.createClient(ExtendedHS2Factory.java:38)
        at com.cloudera.hive.hivecommon.core.HiveJDBCConnection.connect(HiveJDBCConnection.java:597)
        at com.cloudera.hive.jdbc.common.BaseConnectionFactory.doConnect(BaseConnectionFactory.java:219)
        at com.cloudera.hive.jdbc.common.AbstractDriver.connect(AbstractDriver.java:216)
        at java.sql.DriverManager.getConnection(DriverManager.java:664)
        at java.sql.DriverManager.getConnection(DriverManager.java:247)
    Caused by: com.cloudera.hive.support.exceptions.GeneralException: [Simba][HiveJDBCDriver](500310) Invalid operation: Peer indicated failure: Unsupported mechanism type PLAIN;
        ... 7 more
    Caused by: org.apache.thrift.transport.TTransportException: Peer indicated failure: Unsupported mechanism type PLAIN
        at org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:190)
        at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:288)
        at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
        at com.cloudera.hive.hivecommon.api.HiveServer2ClientFactory.createTransport(HiveServer2ClientFactory.java:210)
        at com.cloudera.hive.hive.api.ExtendedHS2Factory.createClient(ExtendedHS2Factory.java:38)
        at com.cloudera.hive.hivecommon.core.HiveJDBCConnection.connect(HiveJDBCConnection.java:597)
        at com.cloudera.hive.jdbc.common.BaseConnectionFactory.doConnect(BaseConnectionFactory.java:219)
        at com.cloudera.hive.jdbc.common.AbstractDriver.connect(AbstractDriver.java:216)
        at java.sql.DriverManager.getConnection(DriverManager.java:664)
        at java.sql.DriverManager.getConnection(DriverManager.java:247)
        at hive2.hive.main(hive.java:30)

I used the following Cloudera documentation: https://www.cloudera.com/documentation/other/connectors/hive-jdbc/2-5-4.html. Any help will be appreciated. I am trying the following connection string:

Connection con = DriverManager.getConnection("jdbc:hive2://server.com:12345/default;principal=hive/_org.COM","user_id","pwd");

The complete code:

package hive2;
import java.sql.SQLException;
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.Statement;
import java.sql.DriverManager;

public class hive {
  private static String driverName = "com.cloudera.hive.jdbc4.HS2Driver";

  public static void main(String[] args) throws SQLException {
    try {
      Class.forName(driverName);
    } catch (ClassNotFoundException e) {
      // TODO Auto-generated catch block
      e.printStackTrace();
      System.exit(1);
    }

    Connection con = DriverManager.getConnection("jdbc:hive2://server.com:12345/default;principal=hive/_org.COM", "user_id", "pwd");
    Statement stmt = con.createStatement();
    String tableName = "testHiveDriverTable";
    // DDL statements do not return a result set, so use execute() rather than executeQuery()
    stmt.execute("drop table if exists " + tableName);
    stmt.execute("create table " + tableName + " (key int, value string)");
    ResultSet res;
    // show tables
    String sql = "show tables '" + tableName + "'";
    System.out.println("Running: " + sql);
    res = stmt.executeQuery(sql);
    if (res.next()) {
      System.out.println(res.getString(1));
    }
    // describe table
    sql = "describe " + tableName;
    System.out.println("Running: " + sql);
    res = stmt.executeQuery(sql);
    while (res.next()) {
      System.out.println(res.getString(1) + "\t" + res.getString(2));
    }

    // load data into table
    // NOTE: filepath has to be local to the hive server
    // NOTE: /tmp/a.txt is a ctrl-A separated file with two fields per line
    String filepath = "/tmp/a.txt";
    sql = "load data local inpath '" + filepath + "' into table " + tableName;
    System.out.println("Running: " + sql);
    stmt.execute(sql); // LOAD DATA is not a query and returns no result set

    // select * query
    sql = "select * from " + tableName;
    System.out.println("Running: " + sql);
    res = stmt.executeQuery(sql);
    while (res.next()) {
      System.out.println(String.valueOf(res.getInt(1)) + "\t" + res.getString(2));
    }

    // regular hive query
    sql = "select count(1) from " + tableName;
    System.out.println("Running: " + sql);
    res = stmt.executeQuery(sql);
    while (res.next()) {
      System.out.println(res.getString(1));
    }
  }
}
Kritz
  • You used the syntax for the **Apache** JDBC driver on the **Cloudera** driver!! – Samson Scharfrichter Aug 21 '17 at 14:43
  • The Cloudera drivers ship with an 80+ page manual. Read it. – Samson Scharfrichter Aug 21 '17 at 14:44
  • Are you trying to connect to a secured cluster? Share the code snippet you are using. – SachinJose Aug 22 '17 at 00:56
  • @sachin I have edited the post. Please take a look at it – Kritz Aug 23 '17 at 06:04
  • @SamsonScharfrichter the code I am using is from the manual only.. but still I'll give it a read again :) thank you – Kritz Aug 23 '17 at 06:05
  • BTW the Cloudera driver is a Type 4.x, you don't need to load the class explicitly, it is registered automatically when the JVM parses its CLASSPATH _(in Java at least - Scala & Spark don't do that cleanly)_ – Samson Scharfrichter Aug 23 '17 at 12:03
  • @SamsonScharfrichter Explicitly loading the driver shouldn't give the error, right? I have tried everything but the error is not going away :( – Kritz Aug 23 '17 at 12:22
  • The (obsolete) Cloudera documentation that you have linked is clear: if you have Kerberos authentication, then your URL must be something like `jdbc:hive2://server.domain:10000;AuthMech=1;KrbRealm=DOMAIN;KrbHostFQDN=server.domain;KrbServiceName=hive` >> the URL you are using would be valid for the Apache driver with Kerberos auth (see the first sketch after these comments). – Samson Scharfrichter Aug 23 '17 at 21:13
  • Also, User / Pwd arguments are completely ignored when using Kerberos auth. You need either a valid Kerberos ticket in the default cache, or some raw JAAS configuration and a keytab file, cf. my answer to https://stackoverflow.com/questions/42477466/error-when-connect-to-impala-with-jdbc-under-kerberos-authrication/42506620 – Samson Scharfrichter Aug 23 '17 at 21:17
  • @SamsonScharfrichter I am very new to the Kerberos protocol, so I apologise in advance if I am asking something very basic or stupid. I have set the KrbRealm and KrbHostFQDN, so do I have to set up the cache or make the JAAS configuration? Now I am getting this error: [Simba][HiveJDBCDriver](500310) Invalid operation: Unable to obtain Principal Name for authentication ; – Kritz Aug 28 '17 at 09:57
  • Windows or Linux? Kerberos back-end is Active Directory, OpenLDAP, dedicated MIT Kerberos? – Samson Scharfrichter Aug 28 '17 at 17:13
  • @SamsonScharfrichter Windows... I was able to make the connection yesterday :D I used a dedicated MIT Kerberos ticket cache, but it can use only one user's credentials. What if I deploy it on the server? Then everybody will be using that one person's credentials, right? – Kritz Aug 31 '17 at 07:40
  • It depends. See my answer to https://stackoverflow.com/questions/42477466/error-when-connect-to-impala-with-jdbc-under-kerberos-authrication/42506620 for a JAAS configuration that does not use a ticket cache -- but *explicitly* gives the principal to use, and its keytab (see the second sketch after these comments). You could also prompt for a password, interactively, but that would require overriding some core Java Security properties, and I guess you're not ready for that. – Samson Scharfrichter Sep 02 '17 at 07:26
  • @Kritz can you post an answer on how you got it to work? – Kyle Bridenstine Dec 06 '17 at 21:39
  • @Mr.Tea The code is the same. Since it was a secured cluster, it was using Kerberos authentication, so creating a ticket in the cache with the right user credentials solved the problem for me. – Kritz Dec 08 '17 at 06:03
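
A minimal sketch of the Cloudera/Simba-driver Kerberos URL described in the comments above, assuming a valid Kerberos ticket already sits in the default cache (e.g. obtained with kinit or MIT Kerberos for Windows); the host, port, realm and FQDN are placeholders:

import java.sql.Connection;
import java.sql.DriverManager;

public class ClouderaKerberosSketch {
  public static void main(String[] args) throws Exception {
    // Kerberos authentication (AuthMech=1); user/password arguments are omitted
    // because the driver ignores them with Kerberos.
    String url = "jdbc:hive2://server.example.com:10000/default;"
        + "AuthMech=1;"
        + "KrbRealm=EXAMPLE.COM;"
        + "KrbHostFQDN=server.example.com;"
        + "KrbServiceName=hive";
    // Relies on a ticket already present in the default Kerberos ticket cache.
    try (Connection con = DriverManager.getConnection(url)) {
      System.out.println("Connected: " + !con.isClosed());
    }
  }
}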
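
And a sketch of the ticket-cache-free variant mentioned in the comments, pointing the JVM at a JAAS configuration that logs in from a keytab. Everything here is an assumption for illustration: the file paths, principal, keytab and the JAAS entry name are hypothetical, and the entry name the Cloudera/Simba driver expects should be checked against its install guide.

// Hypothetical jaas.conf (the entry name may differ per the driver's documentation):
//   Client {
//     com.sun.security.auth.module.Krb5LoginModule required
//     useKeyTab=true
//     keyTab="C:/keytabs/app_user.keytab"
//     principal="app_user@EXAMPLE.COM"
//     doNotPrompt=true;
//   };
import java.sql.Connection;
import java.sql.DriverManager;

public class JaasKeytabSketch {
  public static void main(String[] args) throws Exception {
    // Standard Java security properties; the paths are placeholders.
    System.setProperty("java.security.auth.login.config", "C:/conf/jaas.conf");
    System.setProperty("java.security.krb5.conf", "C:/conf/krb5.ini");

    String url = "jdbc:hive2://server.example.com:10000/default;"
        + "AuthMech=1;KrbRealm=EXAMPLE.COM;"
        + "KrbHostFQDN=server.example.com;KrbServiceName=hive";
    try (Connection con = DriverManager.getConnection(url)) {
      System.out.println("Connected via keytab-based JAAS login");
    }
  }
}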

1 Answer


The driver class in your code is the ODBC driver, which cannot be used for a JDBC connection to HiveServer2. Refer to the Apache Hive URL below for the JDBC connection.

A sample code snippet for a HiveServer2 JDBC connection is given at the link below:

https://cwiki.apache.org/confluence/display/Hive/HiveClient#HiveClient-JDBC

The dependent JARs are also listed in the documentation; you can find them in the Hive lib path (/opt/cloudera/CDH/hive/lib/*).
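
For reference, a minimal sketch of the Apache-driver approach described at that link, assuming the hive-jdbc standalone JAR and its Hadoop dependencies are on the classpath, that a valid Kerberos ticket is in the default cache, and that the host, port and principal placeholders are replaced with real values:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class ApacheHiveJdbcExample {
  public static void main(String[] args) throws Exception {
    // JDBC 4.x drivers auto-register, but loading the class explicitly does no harm.
    Class.forName("org.apache.hive.jdbc.HiveDriver");

    // With the Apache driver the Kerberos principal goes directly in the URL;
    // user and password arguments are not needed for Kerberos.
    String url = "jdbc:hive2://server.example.com:10000/default;"
        + "principal=hive/server.example.com@EXAMPLE.COM";

    try (Connection con = DriverManager.getConnection(url);
         Statement stmt = con.createStatement();
         ResultSet res = stmt.executeQuery("show tables")) {
      while (res.next()) {
        System.out.println(res.getString(1));
      }
    }
  }
}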

SachinJose
  • I have already tried this code. It gives an error. Also, where can I get the required JARs if I don't have Hive installed on my local system? – Kritz Aug 23 '17 at 12:21