Idea by Nick Piper · Tags: ssl, nifi-processor, nifi-streaming
In case someone else hits this: when using HDF 3.0.0.0 we got a ClassCastException from o.apache.nifi.controller.FlowController:
2017-07-19 13:48:57,498 WARN [Timer-Driven Process Thread-26] o.apache.nifi.controller.FlowController
java.lang.ClassCastException: [B cannot be cast to java.lang.String
    at org.apache.http.conn.ssl.DefaultHostnameVerifier.getSubjectAltNames(DefaultHostnameVerifier.java:309)
    at org.apache.http.conn.ssl.DefaultHostnameVerifier.verify(DefaultHostnameVerifier.java:112)
    at org.apache.http.conn.ssl.DefaultHostnameVerifier.verify(DefaultHostnameVerifier.java:99)
    at org.apache.http.conn.ssl.SSLConnectionSocketFactory.verifyHostname(SSLConnectionSocketFactory.java:463)
    at org.apache.http.conn.ssl.SSLConnectionSocketFactory.createLayeredSocket(SSLConnectionSocketFactory.java:397)
We found that it was due to HTTPCLIENT-1836, so we are experimenting with downgrading the httpclient library JAR bundled inside the NARs from 4.5.3 to 4.5.2.
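Before patching anything, it can help to confirm which httpclient version each NAR actually bundles (a NAR is just a zip archive). A minimal sketch, assuming the NARs live in some directory you point it at; here it builds a throwaway NAR in a temp directory just to demonstrate:

```python
import os
import re
import tempfile
import zipfile

def find_httpclient_jars(nar_dir):
    """Scan every .nar (zip) in nar_dir and report any bundled httpclient jars."""
    found = {}
    for name in os.listdir(nar_dir):
        if not name.endswith('.nar'):
            continue
        path = os.path.join(nar_dir, name)
        with zipfile.ZipFile(path) as nar:
            jars = [m for m in nar.namelist()
                    if re.search(r'httpclient-[\d.]+\.jar$', m)]
        if jars:
            found[name] = jars
    return found

# Demo with a fabricated NAR; on a real node you would point nar_dir at the
# NiFi lib directory instead (path depends on your install).
tmp = tempfile.mkdtemp()
with zipfile.ZipFile(os.path.join(tmp, 'demo.nar'), 'w') as nar:
    nar.writestr('META-INF/bundled-dependencies/httpclient-4.5.3.jar', b'')
print(find_httpclient_jars(tmp))
```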
To replace the library in the NAR files, we wrote this:
#!/usr/bin/python
import argparse
import cStringIO
import datetime
import hashlib
import os.path
import requests
import sys
import syslog
import urlparse
import zipfile

def log(msg):
    syslog.syslog(msg)
    print msg

parser = argparse.ArgumentParser(description='Remove file from nar and insert new one')
parser.add_argument('--new-jar-url', metavar='URL', required=True,
                    help='URL to the new JAR file')
parser.add_argument('--new-jar-sha512', metavar='SHA512', required=True,
                    help='Checksum of the new Jar file to verify the download')
parser.add_argument('--old-jar-name', metavar='FILENAME', required=True,
                    help='Filename of the JAR file to remove')
parser.add_argument('--path-to-nars', dest='path', required=True,
                    help='Path to the NAR files to rewrite')
args = parser.parse_args()

log('Rewriting NAR files to remove {} and add {}'.format(args.old_jar_name, args.new_jar_url))

if args.new_jar_url.startswith("file://"):
    new_jar_body = open(args.new_jar_url[len("file://"):], 'rb').read()
else:
    new_jar_body = requests.get(args.new_jar_url).content

m = hashlib.sha512()
m.update(new_jar_body)
new_jar_hash = m.hexdigest()
log('{} downloaded and hashed to {}'.format(args.new_jar_url, new_jar_hash))

if new_jar_hash != args.new_jar_sha512:
    log("hash of {} doesn't match (download={}, wanted={}), aborting".format(
        args.new_jar_url, new_jar_hash, args.new_jar_sha512))
    sys.exit(2)

for nar in os.listdir(args.path):
    if not nar.endswith('.nar'):
        continue
    changed_content = False
    fullpath = os.path.join(args.path, nar)
    m = hashlib.sha512()
    m.update(open(fullpath, 'rb').read())
    log('Possibly rewriting {}, current sha512={}'.format(fullpath, m.hexdigest()))
    zin = zipfile.ZipFile(fullpath, 'r')
    buffer = cStringIO.StringIO()
    zout = zipfile.ZipFile(buffer, 'w')
    for item in zin.infolist():
        log('{} contained {}'.format(fullpath, item.filename))
        if os.path.basename(item.filename) == args.old_jar_name:
            new_jar_filename = urlparse.urlsplit(args.new_jar_url).path.split('/')[-1]
            new_full_filename = "{}/{}".format(os.path.dirname(item.filename), new_jar_filename)
            log('Replacing {} with {}'.format(item.filename, new_full_filename))
            zout.writestr(new_full_filename, new_jar_body, zipfile.ZIP_DEFLATED)
            changed_content = True
        else:
            zout.writestr(item, zin.read(item.filename))
    zout.close()
    zin.close()
    if changed_content:
        tempfullpath = "{}_".format(fullpath)
        log('Now writing to {}'.format(tempfullpath))
        open(tempfullpath, 'wb').write(buffer.getvalue())
        log('Renaming {} to {}'.format(tempfullpath, fullpath))
        os.rename(tempfullpath, fullpath)
        log('Renamed over {}'.format(fullpath))
        m = hashlib.sha512()
        m.update(open(fullpath, 'rb').read())
        log('Rewritten {}, new sha512={}'.format(fullpath, m.hexdigest()))
© 2011-2019 Hortonworks Inc. All Rights Reserved.