HCC Hortonworks Community Connection
Hive beeline -e in shell script

Question by bigdata.neophyte · Nov 01, 2016 at 04:43 PM · Tags: hive, kerberos, hiveserver2, beeline, hive-jdbc

Hi,

Our cluster is secured using Kerberos. I now need to run Hive queries in a shell script that will be scheduled to run periodically.

In my shell script, I was thinking of using the commands below in sequence:

>beeline -u"jdbc:hive2://$hive_server2:10000/$hive_db;principal=$user_principal"

>beeline -e"SHOW DATABASES"

But then I realised that once I run the beeline -u command, it takes me into the beeline shell instead of staying in the bash shell, so I'm wondering how to get this sorted out. I need to use the beeline -e command, but I need to connect to the cluster first using the Kerberos principal. Any ideas what's the best way to handle this?

FYI, we are not using Oozie, just a shell script with crontab scheduling.
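
Roughly, this is the shape of script I have in mind (a minimal sketch; the keytab path and all variable values below are made-up placeholders):

#!/bin/bash
# Placeholder values only, not real ones from our environment.
keytab=/etc/security/keytabs/batch_user.keytab
user_principal=batch_user@EXAMPLE.COM
hive_server2=hs2.example.com
hive_db=default

# Get a Kerberos ticket non-interactively so cron can run this unattended.
kinit -kt "$keytab" "$user_principal"

# This is where I'm stuck: the first command drops me into the interactive
# beeline shell, so the second one never runs from bash as planned.
beeline -u "jdbc:hive2://$hive_server2:10000/$hive_db;principal=$user_principal"
beeline -e "SHOW DATABASES"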

Thanks


6 Replies


Answer by Neeraj Sabharwal · Nov 01, 2016 at 05:30 PM

@bigdata.neophyte

Try this

[hdfs@ip-172-31-40-160 ~]$ cat b.sh
beeline << EOF
!connect jdbc:hive2://localhost:10000 hive hive
show tables;
EOF

[hdfs@ip-172-31-40-160 ~]$ sh -x b.sh
+ beeline
Beeline version 1.2.1000.2.5.0.0-1245 by Apache Hive
beeline> !connect jdbc:hive2://localhost:10000 hive hive
Connecting to jdbc:hive2://localhost:10000
Connected to: Apache Hive (version 1.2.1000.2.5.0.0-1245)
Driver: Hive JDBC (version 1.2.1000.2.5.0.0-1245)
Transaction isolation: TRANSACTION_REPEATABLE_READ
0: jdbc:hive2://localhost:10000> show tables;
+-------------------+--+
|     tab_name      |
+-------------------+--+
| edl_good_data     |
| td_edl_good_data  |
| test              |
+-------------------+--+
3 rows selected (0.152 seconds)
0: jdbc:hive2://localhost:10000> Closing: 0: jdbc:hive2://localhost:10000
[hdfs@ip-172-31-40-160 ~]$
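
For the kerberized cluster in the question, the same heredoc idea can be combined with -u so the connect string stays on the beeline command line. A rough sketch reusing the placeholder variables from the question; beeline reads the statements from stdin and exits when it reaches EOF:

beeline -u "jdbc:hive2://$hive_server2:10000/$hive_db;principal=$user_principal" << EOF
SHOW DATABASES;
EOF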


Comment by bigdata.neophyte · Nov 01, 2016 at 11:37 PM

Thanks @Neeraj Sabharwal. Just wondering if there is any standard approach to this. Without connecting first, how does someone use the beeline -e command? i.e. it looks like the -u and -e options are mutually exclusive to me.

Comment by Jinyong Yang · Jul 10, 2018 at 11:41 AM

Cool, very helpful!


Answer by Naveen · Feb 06 at 10:30 AM

@Ken Herring I'm glad it worked; maybe your cluster isn't secured. Could you please accept the answer if it worked, as it will help others.


Answer by Ken Herring · Feb 04 at 03:45 PM

Thanks @Naveen

Your suggestion worked. At first, I wasn't sure about how to create the hql file. I ended up just using vi to create it with the "show partitions" statements.

Also, in my environment, I don't need to provide the credentials; I'm not sure why. I just provided the JDBC string for the ZooKeeper quorum, like the following:

beeline --silent=true --showHeader=false --outputformat=csv2 -u "jdbc:hive2://zk1:zkport,zk2:zkport,zk3:zkport/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2" -f show_partitions_tables.hql > show_partitions_tables.txt

Thanks for your help!
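
If the table list changes often, the .hql file could also be generated instead of hand-edited in vi. A small sketch, assuming a plain-text tables.txt with one db.table name per line (the file names here are just examples):

# Build show_partitions_tables.hql from a list of table names.
rm -f show_partitions_tables.hql
while read -r table; do
  # Skip blank lines so no empty "show partitions ;" statement is emitted.
  [ -z "$table" ] && continue
  echo "show partitions $table;" >> show_partitions_tables.hql
done < tables.txt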


Answer by Naveen · Jan 18 at 02:17 PM

@Ken Herring

The Hive shell has security issues and is deprecated in newer versions of HDP, so please avoid it. Opening a hive/beeline shell for every table will be slow, since it has to spawn a JVM for each table, so avoid the looping approach. Instead, prepare a file with the table list, like below.

cat show_partitions_tables.hql
show partitions table1;
show partitions table2;
show partitions table3;

Use beeline's -f flag to pass the above file, e.g. below.

beeline --silent=true --showHeader=false --outputformat=csv2 -u "jdbc:hive2://$hive_server2:10000/$hive_db;principal=$user_principal" -n ' ' -p ' ' -f show_partitions_tables.hql > show_partitions_tables.txt
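
When this runs from cron, it may also be worth checking beeline's exit status so a failed connection doesn't silently leave an empty output file behind. A hedged sketch wrapping the same command:

# Fail loudly if beeline cannot connect or a statement errors out.
if ! beeline --silent=true --showHeader=false --outputformat=csv2 \
     -u "jdbc:hive2://$hive_server2:10000/$hive_db;principal=$user_principal" \
     -n ' ' -p ' ' -f show_partitions_tables.hql > show_partitions_tables.txt; then
  echo "beeline failed for show_partitions_tables.hql" >&2
  exit 1
fi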

Note: Please accept the answer if it solved your issue.


Answer by Ken Herring · Jan 17 at 07:44 PM

Thanks @Naveen

I needed to generate a file showing all of the partitions for several tables. I ended up using the following with the hive CLI:

#!/bin/bash
rm -f tables.txt
rm -f partitions.txt
echo "
dbname.tablename1
dbname.tablename2
dbname.tablename3
dbname.tablename4
" >> tables.txt
wait
cat tables.txt | while read LINE
do
  echo -e "$LINE\n" >> partitions.txt
  hive -e "show partitions $LINE" >> partitions.txt
  echo -e "\n" >> partitions.txt
done
rm -f tables.txt
echo "Show Partitions generated"

It seemed like hive didn't like something, because it threw a couple of errors at the beginning and the end, but it ended up generating the desired output.

I'd prefer to use beeline but couldn't figure out how to integrate the beeline -u connection with the looping through the list of tables.

Can anyone suggest a similar approach using beeline?
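
For reference, here is roughly how the same loop might look with beeline instead of the hive CLI. This is only a sketch: $jdbc_url stands in for the full connection string, and, as noted in the answer above, each iteration still starts its own JVM:

jdbc_url="jdbc:hive2://$hive_server2:10000/$hive_db;principal=$user_principal"
while read -r LINE; do
  # Skip blank lines so no empty table name is passed to "show partitions".
  [ -z "$LINE" ] && continue
  echo -e "$LINE\n" >> partitions.txt
  beeline --silent=true --showHeader=false --outputformat=csv2 \
    -u "$jdbc_url" -e "show partitions $LINE" >> partitions.txt
  echo -e "\n" >> partitions.txt
done < tables.txt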


Comment by Naveen · Jan 21 at 10:37 AM

@Ken Herring Did you try the approach suggested above? Let me know if you have any issues.


Answer by Naveen · Jan 17 at 10:18 AM

Although this is an old thread, I thought this might be useful for someone: we can use both options on the same line. I'm not sure whether older versions of beeline didn't support this, given the age of the thread. Anyway, the syntax below works for me.

beeline -u"jdbc:hive2://$hive_server2:10000/$hive_db;principal=$user_principal -e"SHOW DATABASES"

